Latent Dirichlet Allocation
Introduction
With the growth of large-scale data, learning methods that automate data analysis have also been multiplying. One of the hot topics in NLP data analysis is topic modelling. LDA is one of the most commonly used topic modelling algorithms, developed by David Blei, Andrew Ng, and Michael Jordan. It is a generative model with a focus on information retrieval, and it can also be viewed as a dimensionality reduction technique.
The basic idea of LDA is that each document is a mixture of latent topics, and each topic is a distribution over words. Given a corpus, LDA tries to discover:
- The set of topics.
- The set of words associated with each topic.
- The distribution of topics within each document.
Figure 1: Plate notation
Figure 1 shown above is the plate notation for LDA. The outer plate represents the documents, whereas the inner plate represents the choice of topics and words within a document. M stands for the number of documents and N for the number of words in a document. Θ denotes the topic distribution for a document; α is the parameter of the Dirichlet prior on the per-document topic distribution, whereas β is the parameter of the Dirichlet prior on the per-topic word distribution. z denotes a topic and w denotes a word.
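The generative process that this plate notation encodes can be sketched in plain Python. This is a minimal sketch, not a real implementation: the two-topic vocabulary, the word distributions, and the symmetric α value below are made up purely for illustration.

```python
import random

def sample_dirichlet(alpha, k):
    """Draw one sample from a symmetric k-dimensional Dirichlet(alpha)
    by normalizing independent Gamma draws."""
    draws = [random.gammavariate(alpha, 1.0) for _ in range(k)]
    total = sum(draws)
    return [d / total for d in draws]

def generate_document(n_words, topic_word_dists, vocab, alpha=0.1):
    """Generate one document following the LDA generative story:
    draw Θ for the document, then for each word draw a topic z from Θ
    and a word w from that topic's word distribution."""
    k = len(topic_word_dists)
    theta = sample_dirichlet(alpha, k)  # per-document topic distribution Θ
    doc = []
    for _ in range(n_words):
        z = random.choices(range(k), weights=theta)[0]             # topic z
        w = random.choices(vocab, weights=topic_word_dists[z])[0]  # word w
        doc.append(w)
    return doc

# Hypothetical two-topic setup: a "food" topic and an "animal" topic.
vocab = ["broccoli", "banana", "kitten", "hamster"]
topic_word_dists = [
    [0.5, 0.5, 0.0, 0.0],  # topic 0 puts its mass on food words
    [0.0, 0.0, 0.5, 0.5],  # topic 1 puts its mass on animal words
]
print(generate_document(6, topic_word_dists, vocab))
```

Because α is small, most generated documents will lean heavily toward one of the two topics, which matches the intuition that a document is usually about a few topics rather than all of them.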
When applying LDA, you need to choose how many topics you want and how many words to show for each topic. Let us look at an example.
- I like to eat broccoli and bananas.
- I ate a banana and spinach smoothie for breakfast.
- Chinchillas and kittens are cute.
- My sister adopted a kitten yesterday.
- Look at this cute hamster munching on a piece of broccoli.
Consider these as five different documents for which you want to generate topics. Applying LDA (Latent Dirichlet Allocation) to these documents might produce the following results.
- Sentences 1 and 2: 100% Topic A
- Sentences 3 and 4: 100% Topic B
- Sentence 5: 60% Topic A, 40% Topic B
Topic B: 20% chinchillas, 20% kittens, 20% cute, 15% hamster
From the topics generated, you can interpret that Topic A is related to food and Topic B to animals.
LDA is a bag-of-words model. It can be very useful for discovering the general theme of documents, and the model it learns can be used to separate documents into topics beyond the training corpus. LDA is often used in recommendation systems, document classification, data exploration, document summarization, and so on.
Workflow
In the first step, data preprocessing is done: stopwords and other unnecessary words are removed from the documents, and the remaining words are stemmed before being given as input to LDA. Now we have a corpus in which every word is stemmed. For learning, the LDA model mainly looks at the frequency of words in the corpus; each word is treated as independent of the other words. This is known as the bag-of-words approach.
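The preprocessing step above can be sketched as follows. This is a toy sketch: the stopword list is a tiny hand-picked set (real pipelines use a much larger library list, such as NLTK's), and `naive_stem` is a crude suffix stripper standing in for a real stemmer like the Porter stemmer.

```python
import re

# Tiny hand-picked stopword set, for illustration only.
STOPWORDS = {"i", "a", "and", "the", "to", "for", "on", "at", "this", "my", "of", "are"}

def naive_stem(word):
    """Very crude suffix stripping, a stand-in for a real stemmer
    (e.g. the Porter stemmer)."""
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def preprocess(document):
    """Lowercase, tokenize, drop stopwords, then stem each token."""
    tokens = re.findall(r"[a-z]+", document.lower())
    return [naive_stem(t) for t in tokens if t not in STOPWORDS]

print(preprocess("I like to eat broccoli and bananas."))
# → ['like', 'eat', 'broccoli', 'banana']
```

The output of this step, one stemmed token list per document, is exactly the bag-of-words input the LDA step consumes.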
The number of topics, the number of words per topic, and the documents are the inputs to LDA. LDA assigns each word of a document to a particular topic and calculates the probability P(word|topic) * P(topic|document). This probability is recalculated for every candidate topic, and the word is then reassigned to a topic based on those probabilities. Repeating this for each word in every document constitutes one iteration; increasing the number of iterations gives more accurate results.
Interpretation of results
After running LDA on the corpus, it will assign the most frequent words to each of the topics. Each document will be assigned a topic or a mixture of topics, as shown in Figure 1, and each word in a topic contributes some probability mass to that particular topic.
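Reading the most probable words off each topic can be sketched as below. The word-count table here is hypothetical, shaped like what a fitted model would produce for the food/animal example above.

```python
def top_words(topic_word_counts, n=3):
    """Return the n most frequent words for each topic,
    given one {word: count} mapping per topic."""
    return [
        [w for w, _ in sorted(counts.items(), key=lambda kv: -kv[1])[:n]]
        for counts in topic_word_counts
    ]

# Hypothetical per-topic word counts for two topics.
counts = [
    {"broccoli": 8, "banana": 6, "smoothie": 3, "kitten": 1},
    {"kitten": 9, "cute": 5, "hamster": 4, "banana": 1},
]
print(top_words(counts))
# → [['broccoli', 'banana', 'smoothie'], ['kitten', 'cute', 'hamster']]
```

Interpreting the topics is then a human step: looking at these word lists, you would label the first topic "food" and the second "animals".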
References
- https://edlab.tc.columbia.edu/blog/13139-Topic-Modeling-with-LDA-in-NLP-data-mining-in-Pressible
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5028368/
- "Statistical Topic Modelling for news articles" by Suganya C, and Vijaya M S.
- "Latent Diriclet Alloation" by David M. Blei, Andrew Y. Ng, and Michael I. Jordan.
- http://pythonhosted.org/trustedanalytics/LdaNewPlugin_Summary.html
- http://blog.echen.me/2011/08/22/introduction-to-latent-dirichlet-allocation/