Here’s how Microsoft’s AI is helping machines translate Indian languages
Microsoft’s Translator app uses deep neural network models to offer real-time translation in Hindi, Bengali, and Tamil. Updated: Jan 25, 2018 15:58 IST
Although India is the world’s second largest Internet market, much of the country’s population is yet to come online. One of the factors that keeps them off the world wide web is the language barrier: most Indians don’t understand English, and much of the web and the apps available to them do not support their preferred local languages.
This is the reality of emerging markets, where literacy rates are often low and the local infrastructure to bridge that gap is not in place. It has also emerged as a problem for many firms, including Silicon Valley companies searching for their next billion users in places like India. But some are also using it as an opportunity to showcase their engineering talent.
Microsoft announced on Thursday that all the services powered by its translation service -- including Bing search, the Bing Translator website, as well as Microsoft Office 365 products like Word, Excel, PowerPoint, Outlook, and Skype -- now offer improved real-time translations in Hindi, Bengali, and Tamil. The company’s Translator app has been updated to recognise and translate text, speech, and even words captured in a photo, the company said.
The new update should come in handy to potentially hundreds of millions of Indians. There are about 521 million Hindi speakers in India, 101 million Bengali speakers, and 75 million Tamil speakers, according to a study by research firm KPMG. The firm estimated that, at the current pace, only about 38 percent of those Hindi speakers, eight percent of Bengali speakers, and just six percent of Tamil speakers would come online by 2021.
Krishna Doss Mohan, senior program manager at Microsoft India, estimates an improvement of about 20 percent in translation quality for the Indic languages. In our brief testing, we were pleasantly surprised by the translation quality as well.
There is an explanation. Mohan says the improvement can be credited to the company’s advancements in deep neural networks. Microsoft began exploring this engineering area years ago, and first brought it to the Microsoft Translator app for select languages in 2016.
Behind the scenes
Microsoft previously relied on a statistical machine translation (SMT) model for translating sentences from one language to another. But the company found the model had its limitations: it struggled to make sense of words in their local context and their relationships with other words. Consider filler words such as “umm”, “hmm”, and “aaa” that are common in everyday conversation. A translator, unless instructed otherwise, doesn’t know what these filler words mean.
“Adding to the complexity are the subtle differences in enunciation, accent, diction, and slang across various regions in India,” the company said. “For example, two native Hindi speakers from different regions of a Hindi-speaking belt may have divergent ways of constructing a sentence or describing the same thing.”
Traditional statistical machine translation models require a trove of data to get better at mapping between the corresponding languages. This was another challenge for Microsoft when it was working with Indic languages, it said: most Indian languages are underrepresented in online conversations, and there is a shortage of parallel pairs of data for them.
This is where deep neural network models come into play. The models draw on established theories of pattern recognition to mimic the way a human brain works and attempt to translate sentences in a similar fashion. As a result, Microsoft says, its deep neural network model is better at understanding granular concepts such as gender and the nature and tone of the words in a sentence.
The new model first recognises spoken words and converts them into text. It then eliminates incomprehensible words and repetitions from the source text. Next, it removes errors it has learned to spot from a database of translated sentences, before finalising the resultant text. The last step synthesises the text-based translation into speech. You can try this by speaking into the company’s Translator app.
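The pipeline above can be sketched in a few lines of Python. This is only an illustration of the disfluency-removal step the article describes, not Microsoft’s actual implementation; the filler-word list and function name are our own placeholders:

```python
# Hypothetical sketch of the "eliminate incomprehensible words and
# repetitions" stage of a speech-translation pipeline. In the full
# pipeline this would sit between speech recognition and the neural
# translation model.

FILLERS = {"umm", "hmm", "aaa"}  # placeholder list of filler words

def remove_disfluencies(text: str) -> str:
    """Drop filler words and immediate word repetitions from recognised text."""
    cleaned = []
    prev = None
    for word in text.split():
        key = word.lower().strip(".,!?")
        if key in FILLERS:
            continue  # skip filler words entirely
        if key == prev:
            continue  # skip an immediate repetition ("the the")
        cleaned.append(word)
        prev = key
    return " ".join(cleaned)

print(remove_disfluencies("umm I I want to to see the the Taj Mahal"))
# -> "I want to see the Taj Mahal"
```

The remaining stages (speech recognition, neural translation, speech synthesis) would each be a trained model in the real system; cleaning the recognised text first means the translation model never has to guess what “umm” means.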
Microsoft says its partners and customers can also make use of the new feature through APIs on Azure and integrate it into their products. Developers can access the company’s Cognitive Services APIs based on this model to develop apps in these languages and fold the translations into their apps’ functionality. Microsoft Edge users can make use of it on any website they visit with the browser. The DNN-powered translation models for Hindi and Bengali are currently live, with Tamil support coming soon.
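For developers, a call to the Translator Text API on Azure looks roughly like the sketch below. The endpoint, `api-version` parameter, and `Ocp-Apim-Subscription-Key` header follow Microsoft’s public API documentation; the key itself is a placeholder you would obtain from the Azure portal, and we only build the request here rather than send it:

```python
import json
from urllib.parse import urlencode

# Sketch of a Translator Text API (v3.0) request on Azure.
# The subscription key below is a placeholder, not a real credential.
ENDPOINT = "https://api.cognitive.microsofttranslator.com/translate"

def build_translate_request(text, to_lang, key):
    """Assemble the URL, headers, and JSON body for one translation call."""
    url = ENDPOINT + "?" + urlencode({"api-version": "3.0", "to": to_lang})
    headers = {
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/json",
    }
    body = json.dumps([{"Text": text}])  # the API accepts a list of texts
    return url, headers, body

url, headers, body = build_translate_request("How are you?", "hi", "<your-key>")
# Sending this (e.g. with urllib.request or the requests library) returns
# JSON containing the translated text for each input item.
```

Batching several texts into one request body is how the API is designed to be used; each input item comes back with its own translation in the response array.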