
Fundamentals of NLP: Word Embeddings to Capture Relationships in Text

Skillsoft-issued completion badges are earned by viewing the required percentage of a course or by receiving a passing score where an assessment is required.

Before training any text-based machine learning model, the text must be encoded into a machine-readable numeric form. Embeddings are the preferred encoding because they capture information about the meaning of the text and remain performant even with large vocabularies.

You will start this course by working with Word2Vec embeddings, which represent words and terms in a feature vector space, capturing the meaning and context of a word in a sentence. You will generate Word2Vec embeddings on your data corpus, set up a Gaussian Naive Bayes classification model, and train it on those embeddings.

Next, you will move on to GloVe embeddings. Using pre-trained GloVe word vectors, you will explore how to find similar words and identify the odd one out in a set. Finally, you will perform classification using several different models, including Naive Bayes and Random Forest.

Issued on

February 6, 2025

Expires on

Does not expire