Thursday, 20 October, 2022 UTC


Summary

Meta has developed an AI translation system that can turn speech in Hokkien, a primarily oral language, into spoken English. It's another step closer to finally making Star Trek's Universal Translator a reality.
CEO Mark Zuckerberg shared a video demonstrating the tech alongside software engineer Peng-Jen Chen in a Facebook post on Wednesday. In it, the two converse in English and Hokkien respectively, with Meta's AI system audibly translating. The demonstration appears fairly impressive, though like Meta's VR legs, it's highly likely the video was edited for illustrative purposes and the current product isn't quite as smooth.
Translation AI is typically trained on text, with researchers feeding their systems reams of written words for them to learn from. However, there are over 3,000 languages that are primarily spoken and have no widely used written system, making them difficult to incorporate into such training.
Hokkien is one such language. Spoken by over 45 million people across mainland China, Taiwan, Malaysia, Singapore, and the Philippines, it is an oral language without an official, standardised writing system.
As such, Hokkien speakers who need to write down information tend to do so phonetically, resulting in significant variation depending on the writer. There is very little recorded data translating Hokkien to English either, and professional human translators are scarce.
To work around this, Meta used written Mandarin as an intermediary between English and Hokkien when training its AI. 
"Our team first translated English or Hokkien speech to Mandarin text, and then translated it to Hokkien or English — both with human annotators and automatically," Meta researcher Juan Pino said. "They then added the paired sentences to the data used to train the AI model."
Of course, filtering a sentence through multiple languages can sometimes distort its meaning, as anyone who has played around with Google Translate knows. To guard against this, Meta worked with Hokkien speakers to check translations, and it is releasing its models, data, and research as open source for other researchers to utilise.
Mashable has reached out to Meta for comment.
We're still quite a way off from having a fully functioning TARDIS Translation Matrix. Meta's oral translation system can currently only handle one sentence at a time, and only works with translations between Hokkien and English. Even so, it's promising progress toward the real-time oral translation Meta is aiming for.
The Hokkien-English project is part of Meta's ongoing Universal Speech Translator program, which aims to translate speech between two languages in real time, as the words are being spoken.