A Gentle Introduction to Skip-gram (word2vec) Model — AllenNLP ver.

Posted on Sat 02 February 2019 in Word Embeddings • Tagged with Word Embeddings, word2vec, AllenNLP

The Skip-gram model (commonly known as "word2vec") is one of the most important concepts in modern NLP, yet many people simply use an off-the-shelf implementation or pre-trained embeddings, and few fully understand how the model is actually built. In this article, I'll cover:

  • What the Skip-gram model is
  • How to …
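As a rough preview of the core idea (a sketch of my own, not code from the article): skip-gram learns word vectors by training a model to predict the words that appear around each center word, so the raw training data is just a stream of (center, context) word pairs drawn from a context window.

    # Minimal sketch: generate skip-gram (center, context) training pairs
    # from a tokenized sentence using a fixed context window size.
    def skipgram_pairs(tokens, window=2):
        pairs = []
        for i, center in enumerate(tokens):
            lo = max(0, i - window)
            hi = min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    pairs.append((center, tokens[j]))
        return pairs

    print(skipgram_pairs("the quick brown fox jumps".split()))
    # [('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ('quick', 'brown'), ...]

Each such pair then serves as a positive example for the model, with negative examples typically drawn from the rest of the vocabulary (negative sampling).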

Continue reading

Improving a Sentiment Analyzer using ELMo — Word Embeddings on Steroids

Posted on Sat 27 October 2018 in Sentiment Analysis • Tagged with Sentiment Analysis, Word Embeddings, ELMo, AllenNLP

In the previous post, I showed how to train a sentiment classifier on the Stanford Sentiment Treebank. Thanks to AllenNLP, a powerful deep NLP framework, we were able to write the entire training pipeline in less than 100 lines of Python code.
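For reference, that pipeline roughly follows the shape sketched below. This is my own condensed sketch in the style of the AllenNLP 0.x API of that era, not the post's exact code; the dataset paths and hyperparameters are placeholders.

    import torch
    from allennlp.data.dataset_readers.stanford_sentiment_tree_bank import StanfordSentimentTreeBankDatasetReader
    from allennlp.data.iterators import BucketIterator
    from allennlp.data.vocabulary import Vocabulary
    from allennlp.models import Model
    from allennlp.modules.seq2vec_encoders import PytorchSeq2VecWrapper
    from allennlp.modules.text_field_embedders import BasicTextFieldEmbedder
    from allennlp.modules.token_embedders import Embedding
    from allennlp.nn.util import get_text_field_mask
    from allennlp.training.trainer import Trainer

    class LstmClassifier(Model):
        """Embed tokens, encode the sequence with an LSTM, classify with a linear layer."""
        def __init__(self, embedder, encoder, vocab):
            super().__init__(vocab)
            self.embedder = embedder
            self.encoder = encoder
            self.linear = torch.nn.Linear(encoder.get_output_dim(),
                                          vocab.get_vocab_size('labels'))
            self.loss = torch.nn.CrossEntropyLoss()

        def forward(self, tokens, label=None):
            mask = get_text_field_mask(tokens)
            embeddings = self.embedder(tokens)
            encoding = self.encoder(embeddings, mask)
            logits = self.linear(encoding)
            output = {'logits': logits}
            if label is not None:
                output['loss'] = self.loss(logits, label)
            return output

    # Read the SST trees (placeholder paths) and build the vocabulary.
    reader = StanfordSentimentTreeBankDatasetReader()
    train_dataset = reader.read('path/to/trees/train.txt')
    dev_dataset = reader.read('path/to/trees/dev.txt')
    vocab = Vocabulary.from_instances(train_dataset + dev_dataset)

    # Token embeddings and an LSTM sequence encoder (dimensions chosen arbitrarily).
    embedder = BasicTextFieldEmbedder(
        {'tokens': Embedding(num_embeddings=vocab.get_vocab_size('tokens'),
                             embedding_dim=128)})
    encoder = PytorchSeq2VecWrapper(torch.nn.LSTM(128, 128, batch_first=True))

    model = LstmClassifier(embedder, encoder, vocab)
    iterator = BucketIterator(batch_size=32, sorting_keys=[('tokens', 'num_tokens')])
    iterator.index_with(vocab)

    trainer = Trainer(model=model,
                      optimizer=torch.optim.Adam(model.parameters()),
                      iterator=iterator,
                      train_dataset=train_dataset,
                      validation_dataset=dev_dataset,
                      num_epochs=10)
    trainer.train()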

In this post, I'm going to explain …


Continue reading