Facebook M2M-100 Model Helps You Translate


Facebook M2M-100 Model

M2M-100 is the first multilingual translation model developed by Facebook to make communication more reliable.

Now you don’t need to rely on English data to talk with your French friend. Yes, Facebook AI is launching M2M-100, the first multilingual machine translation (MMT) model that can translate between any pair of languages without relying on English data.

Facebook is ready to roll out M2M-100 to deliver a richer user experience than ever. This MMT model will help Facebook’s 2 billion users communicate effectively and easily without relying on English data.

What is Facebook M2M-100 Model?

The M2M-100 model, also known as the Multilingual Machine Translation model, translates between nearly 100 languages without going through English data. It is the first time an AI model can translate between any pair of these languages without relying on English.

Why did Facebook decide to build the M2M-100 Model?

Unlike other models on the market, which translate by way of English data, M2M-100, the Multilingual Machine Translation model, is the first model trained on 2,200 language directions, giving visitors a richer translation experience.

Facebook achieved this milestone through years of effort

After years of effort, setbacks, and failures, Facebook has now achieved this milestone. This AI model will help Facebook better understand communication within its community of 2 billion users across the globe without any help from English data.

More from M2M-100 Model

The M2M-100 model is considered more accurate than other translation models because it does not have to use English as an intermediary.

Most models are English-centric, so translating Chinese to French or Chinese to Spanish requires translating into English first before reaching the target language.

BLEU

Facebook argues that direct translations between languages capture more meaning and outperform English-centered systems by ten points on the Bilingual Evaluation Understudy (BLEU) score.
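For readers who want a sense of how a score like this is measured, the sketch below shows a corpus-level BLEU computation using the third-party sacrebleu package; the hypothesis and reference sentences are invented examples, not data from Facebook’s evaluation.

```python
# Minimal sketch of a corpus-level BLEU score with sacrebleu.
# pip install sacrebleu
import sacrebleu

# Hypotheses are system outputs; references are one list per reference set,
# aligned with the hypotheses. Both are made-up examples here.
hypotheses = ["Le chat est assis sur le tapis.", "Il pleut beaucoup aujourd'hui."]
references = [["Le chat est assis sur le tapis.", "Il pleut fort aujourd'hui."]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.1f}")  # higher is better
```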

M2M-100 covers 2,200 language directions. As announced by Facebook, it will also release the model, along with the training and evaluation tooling for M2M-100, to other researchers.
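One common way to try the released checkpoints is through the Hugging Face transformers port of M2M-100. The sketch below assumes that port and the "facebook/m2m100_418M" checkpoint name; it illustrates direct Chinese-to-French translation and is not Facebook’s official example.

```python
# Minimal sketch of direct Chinese -> French translation with M2M-100,
# assuming the Hugging Face transformers port of the released checkpoints.
# pip install transformers sentencepiece torch
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

# No English pivot: the source language is Chinese, the target is French.
tokenizer.src_lang = "zh"
encoded = tokenizer("生活就像一盒巧克力。", return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.get_lang_id("fr"),  # force French output
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```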

How does Facebook AI describe the process?

Facebook AI describes the process as follows: “To connect the languages of the different groups with one another, we have identified a small number of bridging languages, which are usually one to three main languages in each group.

In the example above, Hindi, Bengali, and Tamil would be bridging languages for Indo-Aryan languages. Then we determined parallel training data for all possible combinations of these bridging languages. With this technique, our training dataset ended up with 7.5 billion parallel sentences corresponding to 2,200 directions.”
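To make the bridging idea concrete, here is a toy sketch of how training directions could be enumerated under a bridge-language scheme. The language groups and bridge assignments below are invented for illustration and are not Facebook’s actual grouping.

```python
from itertools import permutations

# Toy illustration only: the groups and bridge languages below are invented
# for this sketch, not Facebook's actual grouping.
groups = {
    "group_a": ["hi", "bn", "ta", "mr"],   # e.g. languages spoken in India
    "group_b": ["fr", "es", "it", "pt"],   # e.g. Romance languages
    "group_c": ["zh", "ja", "ko"],         # e.g. East Asian languages
}
bridges = ["hi", "bn", "ta", "fr", "es", "zh"]  # one to three per group

directions = set()

# Within each group, keep every ordered pair as a training direction.
for langs in groups.values():
    directions.update(permutations(langs, 2))

# Across groups, connect languages only via the bridge languages.
directions.update(permutations(bridges, 2))

print(f"{len(directions)} training directions in this toy setup")
```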

20 Billion Translations in the Facebook News Feed Every Day

Facebook has long used machine translation on its network, but creating a separate AI model for each language and task does not scale. After all, Facebook carries out 20 billion translations in the Facebook News Feed every day.

To train the MMT model, Facebook had to mine high-quality sentence pairs in multiple languages without going through English. Far more translation data exists to and from English than directly between other languages.
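The announcement does not spell out the mining pipeline, but a common approach is to score candidate sentence pairs by the similarity of multilingual sentence embeddings. The sketch below illustrates that idea with random stand-in embeddings; the sentences, threshold, and embedding size are invented for illustration.

```python
import numpy as np

# Toy sketch of mining parallel sentence pairs by cosine similarity of
# multilingual sentence embeddings. The embeddings below are random stand-ins;
# in practice a multilingual sentence encoder would produce them.
rng = np.random.default_rng(0)

zh_sentences = ["句子一", "句子二", "句子三"]
fr_sentences = ["phrase un", "phrase deux", "phrase trois"]

zh_emb = rng.normal(size=(len(zh_sentences), 1024))
fr_emb = rng.normal(size=(len(fr_sentences), 1024))

# Normalise so the dot product becomes cosine similarity.
zh_emb /= np.linalg.norm(zh_emb, axis=1, keepdims=True)
fr_emb /= np.linalg.norm(fr_emb, axis=1, keepdims=True)
scores = zh_emb @ fr_emb.T
print(scores)  # one similarity score per Chinese/French candidate pair

# Keep only pairs above a similarity threshold as mined "parallel" pairs.
threshold = 0.5
for i, j in zip(*np.where(scores > threshold)):
    print(zh_sentences[i], "<->", fr_sentences[j], f"(score {scores[i, j]:.2f})")
```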

Ultimately, Facebook created an MMT dataset of 7.5 billion sentence pairs across 100 languages. From there, Facebook narrowed the set down to language pairs with high-quality, high-volume data.