Facebook unveils 1st multilingual Machine Translation model

Leading social media platform Facebook on Monday launched an open-source multilingual machine translation (MMT) model that can translate between any pair of 100 languages. Called "M2M-100," it is trained on a total of 2,200 language directions, 10 times more than the previous best English-centric multilingual models.
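The "2,200 directions" figure can be put in perspective with some quick arithmetic (the 10x comparison comes from the article; the pair-counting itself is standard combinatorics):

```python
n_languages = 100

# An English-centric model only trains to-English and from-English directions.
english_centric_directions = 2 * (n_languages - 1)   # 198

# A full many-to-many model would cover every ordered language pair.
all_possible_directions = n_languages * (n_languages - 1)   # 9,900

# M2M-100 trains on 2,200 mined directions, roughly 11x the
# English-centric count -- consistent with the "10 times more" claim.
ratio = 2200 / english_centric_directions

print(english_centric_directions, all_possible_directions, round(ratio, 1))
```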

In a statement, Facebook AI said, "Deploying M2M-100 will improve the quality of translations for billions of people, especially those that speak low-resource languages." When translating, say, Chinese to French, most English-centric multilingual models train on Chinese to English and English to French, because English training data is the most widely available.

The new Facebook MMT model trains directly on Chinese-to-French data to better preserve meaning. It outperforms English-centric systems by 10 points on the widely used BLEU metric for evaluating machine translations. Facebook announced, "We're also releasing the model, training, and evaluation setup to help other researchers reproduce and further advance multilingual models." Using novel mining strategies to create translation data, Facebook built the first truly "many-to-many" dataset, with 7.5 billion sentences for 100 languages. The company said it used several scaling techniques to build a universal model with 15 billion parameters, which captures information from related languages and reflects a more diverse range of language scripts and morphology.

