- Posted by Stephen Whiteley
- On 20/05/2020
- glossary, Neural Machine Translation, NMT, Translation, translation memory, Translation software
Ensuring consistency and quality with Neural Machine Translation
When translating text for any service or product, it is paramount to translate terminology consistently. For example, if your product is a “Fair-Trade Cotton tee-shirt”, it should carry the same name in every instance in which it is promoted. Translating it on your website as an “Ethically-sourced T-shirt” and in your catalogue as an “Organic Cotton Tee” will confuse potential customers.
Now imagine your company is a multinational like Zara or H&M, producing collateral, websites and TV commercials in 30+ languages across multiple regions. The time required to monitor consistency manually would be considerable and, frankly, unsustainable. Technical assistance becomes essential.
Translation Memory and Glossaries
Machine translation workflows can, of course, draw on translation memory. To ensure consistency, however, a glossary is essential. At some point, you must decide on the exact terminology your company will use: not just to keep it consistent, but also to ensure the tone is appropriate in each country. This is especially important if your brand “voice” is colloquial and informal.
A glossary will collate and standardise your terminology across languages, regions, collateral and platforms. It will connect with your translation memory in each language and across all documents. This terminology will then be available to all your translators, on their desktops, via CAT tools.
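To make the idea concrete, here is a minimal sketch of how a glossary check might work in code. All names, terms and translations below are hypothetical examples, not part of any real CAT tool's API: the glossary maps each source term to its single approved translation per language, and a check function flags translations that miss the approved term.

```python
# Hypothetical sketch of a glossary-based consistency check.
# The glossary maps a source term to its approved translation per language.
GLOSSARY = {
    "Fair-Trade Cotton tee-shirt": {
        "fr": "tee-shirt en coton équitable",
        "es": "camiseta de algodón de comercio justo",
    },
}

def check_consistency(source: str, translation: str, lang: str) -> list[str]:
    """Flag glossary terms present in the source whose approved
    translation is missing from the translated text."""
    issues = []
    for term, targets in GLOSSARY.items():
        approved = targets.get(lang)
        if approved and term.lower() in source.lower():
            if approved.lower() not in translation.lower():
                issues.append(
                    f"'{term}' should be rendered as '{approved}' in '{lang}'"
                )
    return issues

# Usage: this French translation uses an unapproved variant, so it is flagged.
print(check_consistency(
    "Buy our Fair-Trade Cotton tee-shirt today!",
    "Achetez notre tee en coton bio dès aujourd'hui !",
    "fr",
))
```

A real terminology server would of course handle inflection, casing rules and fuzzy matching, but the principle is the same: one normative term per language, checked automatically.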
What if your translator is a machine?
Glossary control is possible with statistical machine translation as well as with standard CAT tools. A glossary can dictate which resources the machine uses to produce an automated translation (or which resources your human translator has available). In every case, a human retains control of teaching the machine.
Neural MT, built on artificial intelligence, risks being “too clever”. This self-taught tool does not readily accept human instructions to filter or select specific terminology. Of course, anything is feasible in IT development, but it becomes increasingly complex and costly in both resources and time. How many companies really have the resources to build their own translation AI?
Nevertheless, as the use of AI-based tools inevitably increases, it is vital to keep sight of the quality of the final copy.
NMT can be forced to use only very high-quality terminology resources, i.e. highly reliable, normative terms. But this only works as long as you keep your glossary well maintained and up to date. Along with adding new terms, old and outdated terms must be removed so that they do not remain in use.
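The maintenance step above can be sketched in a few lines. This is an illustrative example only, with invented terms and a hypothetical approval flag; real terminology databases track much richer metadata, but the pruning logic is the same: deprecated entries are removed so the machine can no longer draw on them.

```python
from datetime import date

# Hypothetical glossary where each entry records whether the term is still
# approved and when it was last reviewed (all data invented for illustration).
glossary = {
    "Fair-Trade Cotton tee-shirt": {"approved": True,  "reviewed": date(2020, 4, 1)},
    "Organic Cotton Tee":          {"approved": False, "reviewed": date(2018, 1, 1)},
}

def prune_deprecated(entries: dict) -> dict:
    """Keep only the terms still approved for use, so outdated
    terminology cannot resurface in machine output."""
    return {term: meta for term, meta in entries.items() if meta["approved"]}

# Usage: after pruning, only the normative term remains available.
active_terms = prune_deprecated(glossary)
print(sorted(active_terms))
```

Running a pass like this regularly, alongside adding new terms, is what keeps a glossary trustworthy enough to constrain an NMT system.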
Human and machine integration ensures consistency and quality
This is a new challenge in terminology management. One good solution is to devote more resources to the pre-translation phase, along with an increased focus on post-translation editing and quality control.
In the evolution of translation tools, whenever the machine demonstrates a new way of “thinking”, there is always concern about the impact on quality and linguistic consistency. And every client requires assurance that their terminology will remain correct and consistent.
In short, both clients and IT professionals need to be aware of the importance of terminology. They must cooperate closely with linguists and account managers to integrate new technological tools in the most effective way. We can all profit from machine translation, but only at the appropriate stage in the process.
At Quicksilver, we are confident that humans are here to stay!