
Word Embeddings



In this post, I will talk about word embeddings and their role in the success of many deep learning models in natural language processing. Other interesting reads related to this topic can be found in [1] and [2].

Word embeddings obtained from English and German

A word embedding is a representation of a word W as a vector in an n-dimensional real-valued space (from words to real numbers).
Suppose there are only five words in our vocabulary: king, queen, man, woman, and child. Queen can then be encoded as shown.
Word embeddings give an efficient and expressive representation of words. Such a representation captures semantic and syntactic similarities, and relationships between words can be identified in a very simple manner.

Word embeddings capturing the gender relation. Arrows are mathematical vectors denoting the relationship.
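To make this relationship arithmetic concrete, here is a minimal sketch with hand-crafted toy vectors (not real learned embeddings): the vector king - man + woman ends up closest to queen under cosine similarity.

```python
# Toy 3-dimensional vectors chosen by hand so the gender direction is consistent.
import numpy as np

embeddings = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.5, 0.9, 0.0]),
    "woman": np.array([0.5, 0.2, 0.7]),
    "child": np.array([0.1, 0.4, 0.4]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# king - man + woman should land closest to queen
target = embeddings["king"] - embeddings["man"] + embeddings["woman"]
best = max(
    (w for w in embeddings if w not in ("king", "man", "woman")),
    key=lambda w: cosine(target, embeddings[w]),
)
print(best)  # queen
```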

How are word embeddings generated?
An embedding matrix J is learned by training an unsupervised algorithm on a very large corpus; the embedding of a word W is then obtained by multiplying the one-hot vector of W with J. Thankfully, there are many popular models that provide us with such a matrix J. Two such models are Word2vec and GloVe.
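To make the one-hot multiplication concrete, here is a minimal sketch with a toy five-word vocabulary and a random 5 x 3 matrix standing in for a learned J; the product simply selects the row of J corresponding to the word.

```python
import numpy as np

vocab = ["king", "queen", "man", "woman", "child"]
J = np.random.rand(len(vocab), 3)   # in practice J is learned, not random

def embed(word):
    one_hot = np.zeros(len(vocab))
    one_hot[vocab.index(word)] = 1.0
    return one_hot @ J              # equivalent to the row lookup J[vocab.index(word)]

print(embed("queen"))
```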

Word2vec is a predictive model. It comes in two flavours: Continuous Bag Of Words (CBOW) and skip-gram. In CBOW we take a window around some target word and consider the words around it; those context words are fed into the network, which tries to predict the target word. Skip-gram does the opposite: it takes the target word as input and tries to predict the words in the window around it.
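Below is a minimal sketch using the gensim library (assuming gensim 4.x and a tiny illustrative corpus; real embeddings need a far larger one). The sg parameter switches between the two variants: sg=1 trains skip-gram, sg=0 trains CBOW.

```python
from gensim.models import Word2Vec

sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "with", "the", "child"],
    ["the", "woman", "walks", "with", "the", "child"],
]

# sg=1 selects skip-gram (predict context from target);
# sg=0 selects CBOW (predict target from context).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

print(model.wv["queen"].shape)          # (50,)
print(model.wv.most_similar("king"))    # nearest words by cosine similarity
```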


Word2vec architecture

GloVe is a count-based model that learns its vectors by essentially performing dimensionality reduction on the word co-occurrence count matrix. One can get to know more about these models from [3], [4], and [5].
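As a rough illustration of the count-based idea (not the actual GloVe algorithm, which fits a weighted least-squares objective on log co-occurrence counts), the sketch below builds a small co-occurrence matrix and reduces it with a truncated SVD.

```python
import numpy as np

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
]
vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Co-occurrence counts within a symmetric window of size 2
counts = np.zeros((len(vocab), len(vocab)))
window = 2
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[index[w], index[sent[j]]] += 1

# Truncated SVD: keep the top-k singular directions as word vectors
U, S, Vt = np.linalg.svd(counts)
k = 2
word_vectors = U[:, :k] * S[:k]
print(dict(zip(vocab, word_vectors.round(2))))
```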

The word embeddings obtained from the previous step are used as the input layer of CNNs and LSTMs for various tasks such as sentence classification and sentence similarity. Thus, word embeddings have played a pivotal role in improving the results of natural language processing tasks.
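For example, here is a minimal PyTorch sketch of an LSTM sentence classifier whose input layer is a frozen pretrained embedding (the matrix here is a random placeholder; in practice it would be loaded from Word2vec or GloVe files).

```python
import torch
import torch.nn as nn

vocab_size, dim = 10000, 100
pretrained = torch.randn(vocab_size, dim)   # placeholder for real pretrained vectors

class SentenceClassifier(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        # freeze=True keeps the pretrained word embeddings fixed during training
        self.embedding = nn.Embedding.from_pretrained(pretrained, freeze=True)
        self.lstm = nn.LSTM(dim, 64, batch_first=True)
        self.fc = nn.Linear(64, num_classes)

    def forward(self, token_ids):               # (batch, seq_len)
        x = self.embedding(token_ids)            # (batch, seq_len, dim)
        _, (h, _) = self.lstm(x)                 # h: (1, batch, 64)
        return self.fc(h[-1])                    # (batch, num_classes)

logits = SentenceClassifier()(torch.randint(0, vocab_size, (4, 12)))
print(logits.shape)   # torch.Size([4, 2])
```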

References
[1] https://blog.acolyer.org/2016/04/21/the-amazing-power-of-word-vectors/
[2] http://blog.aylien.com/overview-word-embeddings-history-word2vec-cbow-glove/
[3] http://mccormickml.com/2016/04/27/word2vec-resources/#kaggle-word2vec-tutorial
[4] https://deeplearning4j.org/word2vec.html
[5] https://nlp.stanford.edu/pubs/glove.pdf


