Natural Language Processing with Probabilistic Models



In Course 2 of the Natural Language Processing Specialization, you will:

a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming (a minimal sketch of this technique follows this list),
b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics,
c) Write a better auto-complete algorithm using an N-gram language model, and
d) Write your own Word2Vec model that uses a neural network to compute word embeddings …
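
The auto-correct topic in (a) rests on minimum edit distance computed with dynamic programming. The sketch below is an illustrative example only, not the course's own implementation: the function name, the default insert/delete/replace costs of 1/1/2 (one common convention), and the sample strings are assumptions chosen for demonstration.

def min_edit_distance(source: str, target: str,
                      ins_cost: int = 1, del_cost: int = 1, rep_cost: int = 2) -> int:
    """Return the minimum cost of transforming `source` into `target`."""
    m, n = len(source), len(target)
    # D[i][j] = minimum cost to convert source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):          # delete every character of source[:i]
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):          # insert every character of target[:j]
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            r = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,      # delete source[i-1]
                          D[i][j - 1] + ins_cost,      # insert target[j-1]
                          D[i - 1][j - 1] + r)         # replace (or match)
    return D[m][n]

if __name__ == "__main__":
    print(min_edit_distance("play", "stay"))  # prints 4

Running the example prints 4, since turning "play" into "stay" requires two replacements at cost 2 each; the table D records the cheapest way to convert each prefix of the source into each prefix of the target.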

Course 2 of 4 in the Natural Language Processing Specialization

Self-paced

48,245 already enrolled

Rating: 4.4 out of 5 stars (1,375 ratings on Coursera)

Go to the Course
We have partnered with providers to bring you a collection of courses. When you buy through links on our site, we may earn an affiliate commission from the provider.