The technological revolution continues at pace, bringing us everything from kettles that can be controlled by our phones and watches that can pay grocery bills to cars that drive themselves.
From a language perspective, we now have machine translation and interpreting systems such as Google Translate (enabling anyone with Google Chrome to decipher whole pages of foreign web text) and voice translation apps such as Lexifone and VoiceTra.
This raises the inevitable question: are multilingual professionals doomed to lose their prestige on the market and become relics of the future’s past, replaced by ever more sophisticated and cheaper automated and AI systems?
The short answer is: probably not for a good few decades at least. Machines offer a useful alternative when there isn’t a skilled multilingual professional around, but the latest tests have shown that humans are still far superior when it comes to accuracy and quality.
For a start, machines don’t understand the differences between languages. Each language has its own grammatical conventions, nuances, quirks, colloquialisms and so on. Translation isn’t science or maths and can’t be reduced to a standardised set of processes.
You can see the limitations of technology by typing a few paragraphs into Google Translate. You’ll get a translation that gives you a reasonable understanding of the original text, but the grammar and structure are all over the shop. Throw a confusing word or phrase into the mix and the system can break down, giving you gobbledegook (or Googledegook). Now imagine trying to use this technology to carry out complex business negotiations.
Furthermore, machines can’t interpret meaning or understand context. The most advanced and brilliant piece of technology can calculate precise algorithms in a fraction of a second, but it has no external awareness. It doesn’t understand how cultural norms influence language, or how context and tone can completely change the meaning of words.
How can a machine tell the difference between light-hearted and deadly serious? How will it deal with words that have multiple meanings, such as jam, bolt or bark?
Even trickier for machines: how will they ever wrap their operating systems around non-verbal language, such as body language, facial expressions and gesticulations? Even the stiffest, most uptight person in the room gives off non-verbal cues. In fact, an estimated 55 percent of our communication is non-verbal, rising to 93 percent if elements such as tone are included.
Considering these points, plus the fact that even the most commonly used machine services such as Google Translate are only available in around 80 of the world’s 6,000 or so languages, it’s unlikely that the need for language skills among humans will die off just yet. Until we get something that matches the inbuilt Universal Translator technology on Star Trek, we’re still far better off with executives who can do business in other languages.