We all grow up reading stories; some even shape our manners and values, and they entertain us and help us relieve stress.
Automating story generation has been a long-researched problem. The main difficulty is that neither the inputs nor the desired properties of the output are clearly defined, which makes evaluating a generated story problematic: how can you say whether your algorithm performs well if you cannot judge its output, and how do you design an algorithm when the desired result is not clearly specified?
Many techniques have been used to build such systems:
- Planning approaches, where the start states of the characters and the world, along with goals (character goals or author goals), are given, and a story is generated by constructing a plan that moves from the start state to an end state satisfying the goal.
- Case-based approaches, where a database of previous stories and a pre-existing ontology are used to assemble a new story.
- Recurrent neural network approaches, where a model is trained on stories of a similar kind and then used to predict the characters, words, and sentences that follow by assigning them probabilities.
We focus on open story generation, which requires telling a story in any given domain without retraining the model. The latest technique generates stories via Markov Chain Monte Carlo (MCMC) and deep neural networks. The problem of open story generation is broken down into two sub-problems (a toy sketch follows the list below):
- Generating a successor event.
- Translating events back into human-readable natural language.
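As a minimal sketch of this two-stage pipeline, assuming events are represented as simple (subject, verb, object) tuples and using a hard-coded successor table in place of a learned event-to-event model (all names and data below are purely illustrative):

```python
from typing import List, Tuple

# Assumed event representation: a simple (subject, verb, object) tuple.
Event = Tuple[str, str, str]

# Toy stand-in for a learned event-to-event model (sub-problem 1).
SUCCESSORS = {
    ("knight", "finds", "map"): ("knight", "follows", "map"),
    ("knight", "follows", "map"): ("knight", "discovers", "treasure"),
}

def generate_successor_event(event: Event) -> Event:
    return SUCCESSORS.get(event, ("the story", "ends", ""))

def realize_sentence(event: Event) -> str:
    # Template-based realization (sub-problem 2): turn an event back into a readable sentence.
    return " ".join(word for word in event if word).capitalize() + "."

def generate_story(seed: Event, length: int = 3) -> List[str]:
    events = [seed]
    for _ in range(length - 1):
        events.append(generate_successor_event(events[-1]))
    return [realize_sentence(e) for e in events]

print(" ".join(generate_story(("knight", "finds", "map"))))
# Knight finds map. Knight follows map. Knight discovers treasure.
```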
Example: a noun can be a subject or an object depending on its position, so the next word may be a verb or the end of the sentence. When deciding on the next word, we therefore consider both the position and the type of the previous word.
In the example above, if we see a noun at position 0, we prefer a verb at position 1.
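A small illustrative sketch of this idea; the probability table below is invented for illustration and is not taken from any trained model:

```python
# Hypothetical table: (type of previous word, its position) -> next-word-type preferences.
NEXT_TYPE_PROBS = {
    ("noun", 0): {"verb": 0.9, "end": 0.1},  # a sentence-initial noun acts as the subject
    ("verb", 1): {"noun": 0.8, "end": 0.2},
    ("noun", 2): {"end": 0.7, "verb": 0.3},  # a later noun is more likely an object
}

def next_type_distribution(prev_type: str, prev_position: int) -> dict:
    return NEXT_TYPE_PROBS.get((prev_type, prev_position), {"end": 1.0})

print(next_type_distribution("noun", 0))  # {'verb': 0.9, 'end': 0.1}
```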
Two acceptance criteria are used, and their values are multiplied together to score a candidate story:
- Event succession: the sequence of events matters, since certain events are more likely to follow a given event than others. A sequence-to-sequence neural network is trained as a model of event succession; it consists of two networks, an encoder and a decoder, which together give the probability of each event succeeding the given event. These probabilities are summed over the whole story to produce a succession score.
- Long-range event relationships: the event succession model captures which sentence should come after which, but it misses long-distance relationships between different parts of the story. To capture these, k verbs are chosen and the story is scored by checking whether those verbs appear in that order: if they do not, this score is 0 (and hence the multiplied story score is also 0); otherwise it is 1. The k verbs can be any verbs and were originally supplied by human intuition; to automate this, a skipping recurrent neural network is trained on a corpus of stories to learn the k most important events in a story. A sketch of how both criteria combine into a single score follows this list.
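Below is a hedged sketch of how the two criteria might combine into one story score, with a placeholder succession_prob standing in for the sequence-to-sequence model and the key-verb ordering check implemented as a simple subsequence test; this is an illustration of the described scoring, not the paper's actual code:

```python
from typing import List, Sequence

def succession_prob(prev_event: str, next_event: str) -> float:
    """Placeholder for P(next_event | prev_event) from the seq2seq encoder-decoder."""
    return 0.5  # constant stand-in, for illustration only

def succession_score(events: List[str]) -> float:
    # Criterion 1: sum the succession probability of each event given its predecessor.
    return sum(succession_prob(a, b) for a, b in zip(events, events[1:]))

def verbs_in_order(story_verbs: Sequence[str], key_verbs: Sequence[str]) -> int:
    # Criterion 2: 1 if the k key verbs occur in the story in the given order, else 0.
    it = iter(story_verbs)
    return int(all(v in it for v in key_verbs))

def story_score(events: List[str], story_verbs: Sequence[str],
                key_verbs: Sequence[str]) -> float:
    # The two criteria are multiplied, so an out-of-order story scores 0.
    return succession_score(events) * verbs_in_order(story_verbs, key_verbs)

# Example: the key verbs "find" and "escape" must appear in that order.
print(story_score(["e1", "e2", "e3"],
                  story_verbs=["find", "fight", "escape"],
                  key_verbs=["find", "escape"]))  # in order -> nonzero score
```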
References
https://www.cc.gatech.edu/~riedl/pubs/int17.pdf
http://dspace.sliit.lk/bitstream/123456789/271/1/An%20Ontology%20Based%20Natural%20Language%20Story%20Generation%20Approach%20-%20WELOTA.pdf