Google's AI just created its own universal 'language'

The technology used in Google Translate can identify shared structure between languages to create what's known as an interlingua

By MATT BURGESS Wednesday 23 November 2016

Google has previously taught its artificial intelligence to play games, and it's even capable of creating its own encryption. Now, its language translation tool has used machine learning to create a 'language' all of its own.

In September, the search giant turned on its Google Neural Machine Translation (GNMT) system to help it automatically improve how it translates languages. The machine learning system analyses and makes sense of languages by looking at entire sentences – rather than individual phrases or words.
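To make that sentence-level idea concrete, here is a minimal, purely illustrative sketch contrasting phrase-by-phrase lookup with encoding a whole sentence into a single vector before translation. It is not Google's code; the toy vocabulary, phrase table and averaging "encoder" are assumptions for illustration only.

```python
import numpy as np

# Phrase-based style: each chunk is looked up on its own, so context from the
# rest of the sentence is lost.
phrase_table = {"bank": "banco", "river bank": "orilla del rio"}

def phrase_by_phrase(chunks):
    return [phrase_table.get(chunk, chunk) for chunk in chunks]

# Sentence-level style, as in neural translation: the whole sentence is first
# compressed into one vector, so every word is handled in the context of the others.
vocab = {"the": 0, "boat": 1, "reached": 2, "river": 3, "bank": 4}
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 8))    # toy word embeddings

def encode_sentence(sentence):
    ids = [vocab[w] for w in sentence.lower().split() if w in vocab]
    return embeddings[ids].mean(axis=0)          # one vector representing the full sentence

print(phrase_by_phrase(["the", "river bank"]))
print(encode_sentence("the boat reached the river bank").shape)   # (8,)
```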

Following several months of testing, the researchers behind the AI have seen it translate between language pairs it was never explicitly trained on. "An example of this would be translations between Korean and Japanese where Korean⇄Japanese examples were not shown to the system," Mike Schuster, from Google Brain, wrote in a blog post.

The team said the system was able to make "reasonable" translations for language pairs it had never been taught to translate. In one instance, described in a research paper published alongside the blog post, the AI was trained on Portuguese→English and English→Spanish examples and was then able to produce Portuguese→Spanish translations.

"To our knowledge, this is the first demonstration of true multilingual zero-shot translation," the paper explains. To make the system more accurate, the computer scientists then added additional data to the system about the languages.

However, the most remarkable feat of the research paper isn't that an AI can learn to translate language pairs it has never been shown examples of; it's the fact that it used this skill to create its own 'language'. "Visual interpretation of the results shows that these models learn a form of interlingua representation for the multilingual model between all involved language pairs," the researchers wrote in the paper.

An interlingua is a type of artificial intermediary language used to bridge between other languages. In this case, the interlingua is a shared representation inside the AI that allows it to translate material it has never seen paired before.

"Using a 3-dimensional representation of internal network data, we were able to take a peek into the system as it translated a set of sentences between all possible pairs of the Japanese, Korean, and English languages," the team's blogpost continued. The data within the network allowed the team to interpret that the neural network was "encoding something" about the semantics of a sentence rather than comparing phrase-to-phrase translations.

"We interpret this as a sign of existence of an interlingua in the network," the team said. As a result of the work, the Multilingual Google Neural Machine Translation is now being used across all of Google Translate and the firm said multilingual systems are involved in the translation of 10 of the 16 newest language pairs.

The research from the Google Brain team follows its recent work that taught AI to create a form of encryption. In a research paper published online, the scientists created three neural networks: Alice, Bob, and Eve. Each network was given its own job: Alice to encrypt a message, Bob to receive and decrypt it, and Eve to attempt to decipher the message without access to the encryption key.

After training, the AIs were able to convert plain-text messages into encrypted messages using their own form of encryption, and then decode them again.
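A rough sketch of that three-network setup follows, under heavy assumptions: the tiny networks, bit lengths and training schedule are illustrative stand-ins written with PyTorch, not the architecture from the Google Brain paper. Alice and Bob share a key and are rewarded when Bob recovers the message and Eve does not; Eve is trained in alternation to recover the message from the ciphertext alone.

```python
import torch
import torch.nn as nn

N = 16  # bits per message and per key (illustrative)

def net(in_dim, out_dim):
    # Small stand-in network; the paper's models were more elaborate.
    return nn.Sequential(nn.Linear(in_dim, 2 * N), nn.Tanh(), nn.Linear(2 * N, out_dim), nn.Tanh())

alice = net(2 * N, N)   # sees message + key, outputs a ciphertext
bob   = net(2 * N, N)   # sees ciphertext + key, outputs a reconstructed message
eve   = net(N, N)       # sees only the ciphertext

opt_ab  = torch.optim.Adam(list(alice.parameters()) + list(bob.parameters()), lr=1e-3)
opt_eve = torch.optim.Adam(eve.parameters(), lr=1e-3)

for step in range(2000):
    msg = torch.randint(0, 2, (64, N)).float() * 2 - 1   # random ±1 "bits"
    key = torch.randint(0, 2, (64, N)).float() * 2 - 1   # shared secret key

    # Alice/Bob step: Bob should reconstruct the message, Eve should fail.
    cipher  = alice(torch.cat([msg, key], dim=1))
    bob_err = (bob(torch.cat([cipher, key], dim=1)) - msg).abs().mean()
    eve_err = (eve(cipher) - msg).abs().mean()
    loss_ab = bob_err - eve_err                           # simplified adversarial objective
    opt_ab.zero_grad(); loss_ab.backward(); opt_ab.step()

    # Eve step: try to recover the message from the ciphertext alone.
    cipher   = alice(torch.cat([msg, key], dim=1)).detach()
    loss_eve = (eve(cipher) - msg).abs().mean()
    opt_eve.zero_grad(); loss_eve.backward(); opt_eve.step()
```

The alternating updates are the key design point: the encryption scheme is not designed by hand but emerges because Alice and Bob are penalised whenever the eavesdropping network starts to succeed.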

