
Machine Translation

What is machine translation?

Machine translation is the translation of text from one language to another without human involvement. It is also called automated translation: "translation performed by a computer".

Why?

Communication is our life. Without communication we cannot express our feelings or share our thoughts. But there are about 6,909 languages in the world, and hardly anyone knows more than ten of them. To understand other languages, machine translation is required: software that converts text from one language into another. It also helps in translating web content and web pages.

Types of Translation

1. Machine Translation

Sometimes the general meaning of a text is all you need from your translation. Machine translation provides a rapid, trusted and cost-effective option when getting the general meaning across is sufficient.

2. Community Translation

Translating with community users provides accuracy for lower cost and reasonable speed. Community translation is less expensive, but may not provide the level of quality and consistency offered by professional translators. Use Community Translation for knowledge base articles, video subtitles, simple web pages, Wiki entries, and online newspapers.

3. Professional Translation

Professional translation offers the highest quality and includes additional levels of review to ensure accuracy. Use Professional Translation for very important content like press releases, brochures, and white papers, to name a few items.
[Image: Content Value Index]

Types of machine translation systems

  1. Rule-based systems use a combination of language and grammar rules. Such a system relies on many built-in linguistic rules and on bilingual dictionaries for each language pair. The software parses the source text and creates an intermediate representation from which the text in the target language is generated. This process requires lexicons with syntactic, morphological and semantic information, along with large sets of rules. Exceptions to the rules, however, are hard to handle.
  2. Statistical systems do not depend on language rules; instead they rely on existing multilingual corpora, learning to translate by analysing large amounts of data for each language pair. Building a statistical translation model is quick, but a minimum of about 2 million words for a specific domain, and even more for general language, is required. That is why these systems have high CPU and disk-space requirements.
  3. Neural Machine Translation (NMT) is a newer approach that makes machines learn to translate through one large neural network. A bidirectional recurrent neural network (RNN) serves as an encoder that encodes the source sentence for a second RNN, known as the decoder, which predicts the words in the target language.
The approach has become increasingly popular amongst MT researchers and developers, as trained NMT systems have started to show better translation performance in many language pairs compared to the phrase-based statistical approach.
*Google also uses NMT now, in a system named Google Neural Machine Translation, which is based on an artificial neural network.
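The rule-based approach described in point 1 can be sketched in a few lines. This is a toy illustration, not a real RBMT engine: the tiny English-to-Spanish dictionary and the single adjective reordering rule are invented for the example.

```python
# Toy rule-based translation sketch (English -> Spanish).
# The lexicon and the single transfer rule are illustrative assumptions.

LEXICON = {
    "the": "el", "red": "rojo", "car": "coche", "house": "casa",
}
ADJECTIVES = {"red"}

def translate(sentence):
    words = sentence.lower().split()
    # Transfer rule: Spanish usually places adjectives after the noun,
    # so swap adjective-noun pairs before dictionary lookup.
    i = 0
    while i < len(words) - 1:
        if words[i] in ADJECTIVES:
            words[i], words[i + 1] = words[i + 1], words[i]
            i += 2
        else:
            i += 1
    # Dictionary lookup; unknown words pass through unchanged.
    return " ".join(LEXICON.get(w, w) for w in words)

print(translate("the red car"))  # -> el coche rojo
```

Even this tiny system shows why exceptions are hard: handling gender agreement ("el coche" but "la casa") would already require extra rules on top of the dictionary.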
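The statistical approach in point 2 is often framed as a noisy-channel decision: pick the target sentence e that maximizes P(e) x P(f|e), where the language model P(e) is learned from target-language text and the translation model P(f|e) from parallel corpora. The probabilities below are made-up toy numbers standing in for models estimated from millions of words.

```python
# Noisy-channel sketch of statistical MT with toy probabilities.

# Toy language model P(e): how fluent is the target sentence?
LM = {"the house is small": 0.4, "the home is small": 0.1}

# Toy translation model P(f|e): how likely is the source given e?
TM = {("das haus ist klein", "the house is small"): 0.3,
      ("das haus ist klein", "the home is small"): 0.4}

def decode(f, candidates):
    # Score every candidate translation and return the best one.
    return max(candidates, key=lambda e: LM[e] * TM[(f, e)])

best = decode("das haus ist klein", list(LM))
print(best)  # "the house is small": 0.4*0.3 = 0.12 beats 0.1*0.4 = 0.04
```

Note how the language model outvotes the translation model here: even though "the home is small" has the higher translation probability, the more fluent candidate wins overall.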
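The encoder-decoder data flow from point 3 can also be sketched. This toy uses a unidirectional RNN (not bidirectional, for brevity) with untrained random weights, so its output is meaningless; the point is only the shape of the computation: the encoder compresses the source into one context vector, and the decoder unrolls from that vector word by word. The vocabularies and hidden size are arbitrary choices for the example.

```python
import math, random

random.seed(0)
H = 4  # toy hidden size

# Untrained random embeddings and weights -- structure only, no learning.
src_vocab = ["ich", "bin", "hier"]
tgt_vocab = ["<s>", "i", "am", "here", "</s>"]
emb = {w: [random.uniform(-1, 1) for _ in range(H)]
       for w in src_vocab + tgt_vocab}
W = [[random.uniform(-1, 1) for _ in range(H)] for _ in range(H)]

def step(h, x):
    # One RNN step: h' = tanh(W h + x)
    return [math.tanh(sum(W[i][j] * h[j] for j in range(H)) + x[i])
            for i in range(H)]

def encode(words):
    h = [0.0] * H
    for w in words:          # read the source left to right
        h = step(h, emb[w])
    return h                 # final state = fixed-size context vector

def decode(context, max_len=5):
    h, w, out = context, "<s>", []
    for _ in range(max_len):
        h = step(h, emb[w])
        # Pick the target word whose embedding best matches the state.
        w = max(tgt_vocab, key=lambda v: sum(a * b for a, b in zip(h, emb[v])))
        if w == "</s>":
            break
        out.append(w)
    return out

print(decode(encode(["ich", "bin", "hier"])))
```

In a trained system the weights would be learned from parallel data, the decoder would emit a probability distribution over the vocabulary at each step, and an attention mechanism would let it look back at all encoder states rather than a single context vector.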

References:

  • https://leonardoaraujosantos.gitbooks.io/artificial-inteligence/content/recurrent_neural_networks/machine-translation-using-rnn.html
  • https://en.wikipedia.org/wiki/Google_Neural_Machine_Translation
  • https://en.wikipedia.org/wiki/Statistical_machine_translation
  • https://en.wikipedia.org/wiki/Rule-based_machine_translation
  • https://cs224d.stanford.edu/reports/GreensteinEric.pdf
