Posts Tagged ‘Technology and software’

Quick wins for technical writing – how local LLMs can speed up your daily work

Posted on: February 27th, 2026 by Frank Wöhrle

Find out why local large language models (LLMs) are considered an insider tip for technical writing and how you can use AI to automate routine tasks.

Is your company yet to discover knowledge-based chatbots? Do you want to ensure that your sensitive data does not leave the computer? If so, we have a host of practical tips to get you off to a good start.

Why a local LLM makes sense for technical writers

Whether product changes, API updates or new features – technical documentation often needs to be adapted at short notice. This is precisely where local LLM platforms such as Ollama come into play:

  • Data sovereignty: Your content remains in-house.
  • Offline capability: It can be used even without an internet connection.
  • Cost-effectiveness: No ongoing cloud fees.
     

Please note: Make sure you involve your IT security department in installation and configuration – security first!

A number of suitable local LLM environments are now available that can be used to perform simple tasks in the technical writing office. We took a closer look at the Ollama platform.

 


Ollama at a glance – the local LLM platform

Screenshot: Ollama software

 

Who is behind it?

Ollama is an open-source platform for running large language models (LLMs) locally. It was developed by Jeffrey Morgan (CEO) and Michael Chiang, the brains behind Kitematic, now part of Docker.

Over 156,000 GitHub stars (as of 2025) show that the project is growing rapidly. Supported by Y Combinator, Ollama remains community-driven. The platform enables you to run various LLMs locally on your computer (Windows, macOS, Linux). Unlike cloud-based services, your data remains under your control.

Advantages of Ollama

  • No cloud uploads of confidential data
  • Integration of existing reference documents
  • Rapid, repeatable text generation for manuals, API documentation or help files
     

Recommended models

  • Llama3 / Llama3.1 – good balance of size and speed
  • Mistral – ideal for short, precise sections of text
  • Gemma3 – strong at text suggestions and summaries
     

Ollama is easy to install using an installer, and the application can then be launched via a user interface or from the CMD/PowerShell console. When a model is run for the first time, it is downloaded and stored locally.

 


Get your local LLM ready to go in just a few steps

1. Installation

Download Ollama at https://ollama.com/download and follow the installation instructions. Afterwards, the loveable llama will welcome you.

Screenshot: Ollama software

 

2. Download a suitable model

After installation, you can download a local model that is suitable for technical documentation or use a model that is already installed.

Open Ollama, click on “Download” and select, for example, llama3.2, gemma3:4b or mistral.

Screenshot: Ollama software

Alternatively, you can download additional models via the console in Windows (CMD or PowerShell):

To do this, use the command: ollama pull llama3:8b

Screenshot: Ollama software

Wait until the model is fully downloaded – and you’re all set.
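Once a model is available locally, you can also query it from a script via Ollama's local REST API, which listens on port 11434 by default. A minimal Python sketch, assuming Ollama is running and the llama3:8b model has been pulled – the example prompt is purely illustrative:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the locally running Ollama server and return its reply."""
    data = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example call (requires a running Ollama instance):
# generate("llama3:8b", "Summarise what a translation memory is in one sentence.")
```

Because everything runs against localhost, no document content leaves your machine.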

 


Hands-on: How to create a new section of text based on a reference document

Starting point

  • A reference manual (e.g. Word or PDF) containing an older version of the documentation is available.
  • New features or modified technology require the document to be revised or reissued.
     

How it works: Update your manual efficiently – with local support

  • Content reuse: Analysis of existing reference documents (PDF, DOCX) with LLM
  • Structuring: Automatic generation of headings, lists and paragraphs
  • Terminology harmonisation: Matching of technical terminology with targeted prompts

 

Practical tips: Prompt engineering for local LLMs made easy

The key to success lies not in the model itself, but in something known as “prompt engineering”. Since we are using a relatively small local model compared to GPT-4, we need to be very precise here. For our example scenario, we would like to document the new “Comments at segment level” feature based on an existing manual.

Prompt for creating a chapter

We came up with the following prompt for our enquiry. It is important to refer to the reference document, which you can insert using drag and drop.

Screenshot: Ollama software

And the likeable llama replies:

Screenshot: Ollama software

 

Example prompts for further useful queries

# Summary of content
“Summarise the contents of this file in 10 bullet points.”

# Creation of a new section
llama3 model: “Create a new section for the ‘Batch Export’ feature in the style of the reference document. Use short, clear sentences and a level 3 heading. Focus on step-by-step instructions.”

# Formatting adjustments
“Convert all step lists in the following text into numbered lists. Keep the technical terms, but simplify the wording slightly.”

Our tip: Always provide the reference document – this ensures that style and structure remain consistent.
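Prompts along these lines can also be assembled programmatically, which keeps the constraints consistent across requests. A minimal Python sketch – the wording and rules are illustrative, not the exact prompt we used:

```python
def build_section_prompt(feature: str, reference_excerpt: str) -> str:
    """Assemble a prompt that anchors the model to a reference document
    and constrains it against inventing technical details."""
    return (
        f"You are updating a technical manual.\n"
        f"Reference document excerpt:\n---\n{reference_excerpt}\n---\n"
        f"Task: Write a new section for the feature '{feature}' "
        f"in the style and structure of the reference document.\n"
        f"Rules:\n"
        f"- Use short, clear sentences and a level 3 heading.\n"
        f"- Only describe the feature named above.\n"
        f"- Do not invent technical specifications."
    )

prompt = build_section_prompt(
    "Comments at segment level",
    "3.1 Comments\nUsers can add comments to a document ...",
)
```

The same template then works for every new feature: only the feature name and the reference excerpt change.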

 

Saving and post-processing

Even good AI output needs editorial fine-tuning:

  • Check the technical accuracy of the generated content.
  • Adapt the wording to your editorial style.
  • Add screenshots or diagrams if necessary.
  • Test the steps described to ensure they are correct.
  • Try out different models that suit your requirements.

 

Summary: Data protection meets productivity

Local LLMs such as Ollama open up new possibilities for creating technical documentation – without any dependence on the cloud. With precise prompt engineering and targeted post-processing, you can achieve significant time savings without compromising data protection and data sovereignty. Give it a try: you’ll be surprised at how quickly you achieve initial results.

 


More quick wins for technical writing

Scope of application | Description
Automatic structuring | Long passages of continuous text are separated into clear headings, lists and paragraphs.
Template creation | A reference document is used to create a standardised template (e.g. a consistent structure for “Function description”, “Prerequisites”, “Examples”).
Linguistic harmonisation | All sections are harmonised to ensure a consistent style and tone (e.g. informal vs formal, passive vs active).

 


What you should bear in mind with local LLMs

Technical limitations

Context window:

  • Problem: Large documents (> 50 pages) → Possible loss of information during processing by the LLM
  • Solution: Chapter-by-chapter processing
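The chapter-by-chapter workaround can be scripted. A minimal stdlib sketch in Python, assuming the manual uses numbered section headings – the heading pattern and chunk size are illustrative assumptions:

```python
import re

def split_into_chapters(text: str, max_chars: int = 8000) -> list[str]:
    """Split a manual at numbered headings (e.g. '1 Introduction', '2.3 Export')
    so each chunk fits comfortably into a small local model's context window."""
    # Split before lines that start with a section number such as "1 " or "2.3 "
    chapters = re.split(r"\n(?=\d+(?:\.\d+)*\s)", text)
    chunks = []
    for chapter in chapters:
        # Further split overlong chapters at paragraph boundaries
        while len(chapter) > max_chars:
            cut = chapter.rfind("\n\n", 0, max_chars)
            if cut <= 0:
                cut = max_chars
            chunks.append(chapter[:cut])
            chapter = chapter[cut:]
        if chapter.strip():
            chunks.append(chapter)
    return chunks

manual = "1 Introduction\nSome text.\n2 Export\nMore text.\n2.1 Batch Export\nDetails."
chunks = split_into_chapters(manual)
```

Each chunk can then be sent to the model separately, together with the relevant prompt.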
     

Risk of “hallucinations”:

  • Problem: Fictitious technical details in the generated content
  • Solution: Prompt modification with restrictions, e.g. “Only change the explicitly highlighted area,” “Do not invent technical specifications.”

 

Compliance

  • GDPR: On-premises operation keeps data in-house, which greatly simplifies compliance
  • Security audit: Have your IT security team conduct risk analysis prior to implementation
     

 


Local LLMs: Limits today – prospects for tomorrow

When local systems reach their limits – for example, when it comes to team-wide collaboration or larger volumes of documentation – the next step is clear: an integrated, cloud-based solution.

Get in touch with us – we’d be happy to show you how to optimally integrate AI into your technical writing processes.

Terminology in the translation process: Why it’s more important than ever in the age of AI

Posted on: October 30th, 2025 by Frank Wöhrle

Terminology work has always been the key to consistent, high-quality translations. The advent of AI and large language models (LLMs) is fundamentally changing translation processes. But this is precisely what makes terminology even more important.

“AI now translates everything perfectly – so why do we still need terminology management?”
As a language service provider, we hear this question more and more often. But anyone who has ever seen how a single mistranslated technical term can distort a product description, manual or marketing message knows that terminology is not a side issue – it is the foundation of high-quality translation.

In the age of LLMs and generative AI, the how of handling terminology is changing. The why, however, remains unchanged.

Why terminology is the backbone of every translation

Terminology is much more than just a dictionary. It defines how a company talks about its products, services and values.

Whether we’re talking about “controllers”, “control modules” or “control units”, the right term ensures recognition, trust and legal certainty.

Without consistent terminology, inconsistencies are bound to arise. In practice, this leads to:

  • different translations for the same term,
  • increased correction effort,
  • unnecessary additional costs, multiplied by the number of target languages,
  • inconsistent brand communication,
  • misunderstandings among customers or users.


Consistent terminology management is crucial for ensuring consistency and precision in the translation process, especially when dealing with high-volume, multilingual content from the fields of technical documentation or marketing.

From glossaries to integrated solutions: Successful translation processes thanks to terminology

In the past, terminology work was often handled outside of the actual translation process – in the form of Excel spreadsheets or static lists. Today, it can be seamlessly integrated into translation management systems (TMS).
This enables:

  • automatic terminology suggestions directly in the CAT tool,
  • terminology checks during translation,
  • centralised maintenance and approval processes.


This makes terminology a living part of the workflow, not just an afterthought during quality control.
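In its simplest form, such a terminology check needs no AI at all. A minimal Python sketch, assuming a termbase that maps deprecated variants to approved terms – the example terms are illustrative:

```python
def check_terminology(target_text: str, termbase: dict[str, str]) -> list[str]:
    """Flag deprecated term variants in a translated text.
    termbase maps each forbidden variant to the approved term."""
    findings = []
    lowered = target_text.lower()
    for forbidden, approved in termbase.items():
        if forbidden.lower() in lowered:
            findings.append(f"'{forbidden}' found - use approved term '{approved}'")
    return findings

termbase = {"control module": "control unit", "controller": "control unit"}
issues = check_terminology("The control module starts the pump.", termbase)
```

In a TMS, the same principle runs continuously in the background while the translator works, rather than only at the quality-control stage.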

How AI and LLMs are changing terminology work

AI systems and LLMs open up new possibilities for maintaining terminology in a more dynamic and intelligent way. Some specific applications may include:

  • AI terminology extraction:
    AI can quickly analyse multilingual texts to automatically recognise relevant technical terms and suggest them as term candidates. This saves time during the creation phase and helps to identify terminology that has not been taken into account previously. However, final validation remains the task of human experts.
  • Building a terminology database:
    If translations or a defined structure have yet to be established, generative AI can support the creation of a terminology database. This allows variants and synonyms to be clustered efficiently, while metadata such as context, grammatical information or suggested definitions are generated automatically. However, the final review and validation stages are still handled by humans.
  • Terminology checks by AI:
    Terminology errors identified by a rule-based check are sent to the AI, where they are evaluated and corrected in their overall context, taking into account additional terminological information.
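To illustrate the principle behind term-candidate extraction – which in practice is done by the language model – here is a deliberately simple, purely frequency-based Python sketch; the stopword list and sample text are illustrative:

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are", "for", "with"}

def extract_term_candidates(text: str, min_count: int = 2) -> list[tuple[str, int]]:
    """Frequency-based candidate extraction: recurring one- and two-word
    sequences that are not stopwords are proposed as term candidates."""
    words = [w.lower() for w in re.findall(r"[A-Za-z][A-Za-z-]+", text)]
    candidates = Counter()
    for i, word in enumerate(words):
        if word not in STOPWORDS:
            candidates[word] += 1
        if i + 1 < len(words) and word not in STOPWORDS and words[i + 1] not in STOPWORDS:
            candidates[f"{word} {words[i + 1]}"] += 1
    return [(t, n) for t, n in candidates.most_common() if n >= min_count]

sample = ("The control unit reports errors. Each control unit stores a log. "
          "Errors are shown on the control unit display.")
candidates = extract_term_candidates(sample)
```

An LLM improves on this by also recognising single-occurrence terms and clustering synonyms – but in both cases, the candidate list still goes to a human terminologist for validation.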


These new approaches make terminology work faster, more scalable and more data-driven. At the same time, it remains dependent on human validation – because AI does not automatically understand corporate language or brand values.

Limitations and risks: When AI ‘invents’ terms

As powerful as LLMs are, they also pose a real risk. This is because an AI model can:

  • ’hallucinate’ terms, i.e. create plausible but incorrect terms,
  • overlook customer-specific requirements if these are not clearly specified in the prompt or system,
  • put confidential terminology data at risk if it is fed into publicly accessible systems.


The conclusion? AI can support, but not decide. Human expertise remains indispensable when deciding whether a term is terminologically correct, brand-compliant and contextually appropriate.

Best practices: How we combine human expertise with the power of AI

As a language service provider, we see the added value in using technology sensibly – not automating everything blindly. Successful terminology work in the age of AI is based on four principles:

  • Centralisation:
    All terminology data belongs in a central database – not in miscellaneous lists scattered far and wide.
  • Integration:
    Terminology must be directly linked to CAT tools so that translators can access it in real time.
  • AI as a support, not a substitute:
    AI tools can assist with research, extraction and checks – but final validation remains in human hands.
  • Security-conscious:
    Sensitive terminology data should only be processed in data protection-compliant, controlled systems.

 

Terminology remains strategic corporate knowledge

Artificial intelligence and large language models are fundamentally changing how we work with language – but they are no substitute for terminology management. When used correctly, they actually make it more efficient and intelligent. Terminology is a company’s linguistic memory.
Particularly in the age of generative AI, clear and well-defined terms are crucial to ensure that man and machine truly speak the same language.

Contact us if you want to build your terminology efficiently, maintain it consistently and optimise it with AI support – we will support you every step of the way.

Learn more about our services in combination with AI for efficient terminology management

Transit NXT Service Pack 18: Smart functions for translation workflows – now available!

Posted on: September 9th, 2025 by Frank Wöhrle

With Transit NXT Service Pack 18, the STAR Group is introducing a powerful extension to its translation memory system – a real game changer for professional translators and project managers!

What’s new? – Highlights at a glance

  • Various improvements: The update enhances existing functionalities to make translation even more efficient. A variety of user requests have been implemented.
  • Translation of variables in InDesign documents: A particularly exciting addition is that variables in InDesign documents can now be translated directly. This gives translators additional flexibility when dealing with complex layout files.

Why is it worth upgrading to Service Pack 18?

  • Machine translation: DeepL Pro now also supports language variants in glossaries (e.g. for English, Portuguese and Chinese). For Textshuttle, you can now control whether terminology from project dictionaries is transferred to Textshuttle or not.
  • Project exchange: Transit NXT now supports Phrase projects. Users can unpack MXLIFF files directly, translate them and import them back into Phrase.
  • Optimised web search: In the integrated web search, the prioritisation of services has been optimised in order to obtain initial results even faster.

Which stakeholders benefit most from SP 18?

  • Professional translators who frequently work with DTP tools such as InDesign and want to edit the content of variables efficiently.
  • Project managers who want to equip their teams with an even more powerful CAT environment.
  • Companies that strive for high quality, speed and flexibility in localisation.

Follow this link to download the Service Pack

You can find more information at: https://www.star-deutschland.net/en/technology-and-software/software-products/

Transit NXT: The underestimated CAT tool that has the professionals convinced

Posted on: July 2nd, 2025 by Frank Wöhrle

Anyone who regularly works with CAT tools (computer-aided translation software) probably thinks of Trados Studio, memoQ or Across first. One name is often overlooked – and unfairly so: Transit NXT: the underestimated CAT tool from the STAR Group. It is a genuine powerhouse for anyone who wants their work to be structured, consistent and terminology-focused.

What actually is Transit NXT?

Transit is a professional CAT tool that has been on the market since the 1990s. It combines classic segmentation with a project-orientated working method – incorporating translation memory, terminology management, preview options, quality checks and various functions designed specifically for technical documentation.
The extensive and growing portfolio of AI features, which is demonstrated in a series of short videos on our YouTube channel, is not to be missed.

5 reasons why so many professionals have put their trust in Transit NXT for years

1. Up-to-date and contextualised terminology

Transit works seamlessly with TermStar. Live terminology entries are displayed to translators within the editor itself – including the definition, context and source of the term. This deep integration is a clear advantage over tools where terminology is relegated to the sidelines.

2. Project structure, not file chaos

Unlike other CAT tools, Transit thinks in terms of projects with a clear-cut file structure. This takes the hard work out of managing big or lengthy translation projects – especially when it comes to regular updates or complex workflows.

3. Need technical formats? No problem with Transit.

Whether DITA, XML, FrameMaker, InDesign or XLIFF – Transit leads the way when it comes to the variety of natively supported file formats. Many other tools need extra modules or conversions to handle these files.

4. Local installation – full data sovereignty

Transit NXT works entirely locally – without any cloud obligations. For companies that have high data protection requirements, this is a crucial advantage over cloud-based solutions.

5. Quality assurance at the highest level

With automated checks, an in-context preview function and variant check, Transit NXT offers precise quality management for an impressive level of efficiency that is especially beneficial for those handling technical content.

 

Screenshot: Transit software user interface

Who is Transit most suited to?

  • Technical translators working with complex formats.
  • Public authorities, industrial companies and service providers who need to keep sensitive data locally.
  • Freelancers who attach great importance to a reliably maintained terminology.
  • Translation agencies that want an efficient tool for managing large structured projects.

Sound good?

Transit NXT is no entry-level tool – but that is precisely what makes it a great option for anyone who values structure, terminology and format variety.

If you want to see for yourself how Transit works, simply request a non-binding trial version now.

STAR is a certified SCHEMA ST4 translation service provider

Posted on: May 30th, 2025 by Frank Wöhrle

We have successfully completed the training to become a certified translation service provider for the SCHEMA ST4 content management system – STAR Deutschland is now an officially certified SCHEMA ST4 translation service provider.

What is SCHEMA ST4?

SCHEMA ST4 is a professional content management system that more and more companies are turning to when producing technical documentation. It assists users in the creation, management and publication of multilingual product documentation (manuals, instructions, catalogues, online guides, etc.).

SCHEMA ST4 is an XML-based editing system that separates the layout from the textual content. In technical documentation, this is very beneficial when reusing text fragments and when managing multiple languages and versions.

SCHEMA ST4 finds application in a broad spectrum of industries, e.g. in the automotive sector, in mechanical and plant engineering or in pharmaceuticals. One major benefit of this system lies in the extensive optimisation of the translation process, which in turn reduces costs.

Training content and key training topics

The “Translation Management” training programme covers the various steps of the translation process, namely:

  1. Selecting the right text fragments
  2. Exporting the text content for translation, if necessary using COTI
  3. The subsequent import of the translated content into SCHEMA ST4

The training also offers insights into potential challenges that may be encountered, both in terms of the editing and the translation.

Translation process for SCHEMA ST4 content

The SCHEMA ST4 content management system is one of the most frequently implemented solutions in technical editing among STAR’s customers.

Let us assist you with our in-depth knowledge of the SCHEMA ST4 translation interface and the related processes.

Get in touch now with no obligation!

Translate more efficiently with AI – but how?

Posted on: February 26th, 2025 by Frank Wöhrle

AI: What started as a buzzword, and then became an established term in everyday language, is now a basic requirement for many applications and processes. And this technology is not stopping at the language industry either. Since the launch of ChatGPT, we have known that translating can also be completely interactive. Large language models, also known as LLMs, in chatbot form are now flooding the market. It feels as though a new model is popping up every week, announcing its intention to outdo its competitors in terms of efficiency, quality and reliability. Neural machine translation (NMT) doesn’t seem that old – and yet we are already discussing when this technology will disappear from the market and be replaced by generative AI.

The key question is: I want to translate more efficiently with AI – but how?

AI for targeted optimisation of translation quality

Even though the technology has made significant progress over the last five years, the results of the commonly used and established NMT systems are not always good enough. This can have a variety of causes:

  • The desired language combination has not been trained with sufficient material or goes via a pivot language (often English). This can lead to structural problems or errors in meaning.
  • The MT system does not recognise specialist or customer-specific terminology.
  • The MT system was used for content in which style is extremely important or the translation needs to be targeted towards a specific target group.


Manuals, marketing texts or content with high customer visibility therefore often do not achieve the desired levels of quality through machine translation alone. Language professionals then optimise the machine-generated texts as part of a post-editing process. Machine translations are carefully checked, compared with the source text and corrected if necessary.

As a central translation platform, the CAT tool enables users to work efficiently and offers targeted support for quality assurance thanks to a range of automated features. But where exactly is AI being used here? LLMs such as ChatGPT from OpenAI are perfectly capable of producing translations that, like DeepL or Google Translate, provide a good starting point for further processing, depending on how it is to be used.

However, a significant leap in quality can be achieved by improving translation requests through the targeted use of prompts and the addition of reference files. To achieve this, validated translation resources in the form of translation memory and terminology databases are a fundamental prerequisite, alongside well thought-out prompt engineering.

 

AI for better translation resources

As with any new technology, a question often arises: What can AI do for me?

However, if you want to integrate AI into your language processes in the long term, you should first ask yourself: What can I do for AI?

Well-maintained translation resources make a significant contribution to improving the results of your AI solution. Take the topic of terminology, for example. If you use a generic system such as DeepL for your translation processes, you will receive translations that do not match your company terminology – unless you integrate a glossary.

Are you only at the stage of establishing your terminology but don’t want to miss out on the benefits of MT? Use language models to extract potential terminology from your monolingual or multilingual documents. You can also use AI to check your translation memory databases, for example to find inconsistent translations or to automate clean-up or correction across large data sets. Use these resources consistently to increase the translation quality of your language model or improve the output of NMT systems.
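Before handing a translation memory to a language model for clean-up, the most obvious inconsistencies can already be listed with a few lines of code. A minimal Python sketch, assuming the TM has been exported as simple source/target pairs – the example segments are illustrative:

```python
from collections import defaultdict

def find_inconsistent_translations(tm_pairs: list[tuple[str, str]]) -> dict[str, set[str]]:
    """Find source segments that have more than one distinct target
    translation in a translation memory export."""
    targets = defaultdict(set)
    for source, target in tm_pairs:
        targets[source.strip()].add(target.strip())
    return {src: tgts for src, tgts in targets.items() if len(tgts) > 1}

tm = [
    ("Press the start button.", "Drücken Sie die Starttaste."),
    ("Press the start button.", "Betätigen Sie den Startknopf."),
    ("Close the valve.", "Schließen Sie das Ventil."),
]
inconsistent = find_inconsistent_translations(tm)
```

A language model can then take over the harder part: deciding which of the variants is correct in context and correcting the rest.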

AI as co-pilot? Reach your destination safely with the new STAR webinar series

As you can see, we are extremely enthusiastic about the topic of AI – and we don’t claim to be reinventing the wheel. However, the technology offers a lot of potential for optimisation if it is used efficiently and sustainably.

Of course, we would like to share our enthusiasm with you and invite you to our webinar series “AI as co-pilot: Forging new paths to smart language processes” starting in March. All the webinars will be held in German only.


How exactly does generative AI actually work? What advantage does it offer for the translation? How can I use language models to create product texts? Can I train my own AI? And what actually happens to my data?

Our Language Technology Consultant Julian Hamm answers these and many other questions and discusses the many different uses of generative AI, including translation, terminology management, content creation and content delivery. You can expect the following content in the first group of topics:

  • Was ist generative KI, und wofür kann ich sie einsetzen? (What is generative AI and what can I use it for?)
  • Wie kann KI bei der Übersetzung unterstützen? (How can AI support translation?)
  • Wie kann ich KI für die Terminologiearbeit einsetzen? (How can I use AI for terminology work?)
  • Welche Vorteile bietet KI für die Content-Erstellung? (What advantages does AI offer for content creation?)


Further information on the events and the registration form can be found here.

We look forward to you joining us!

Transit NXT Service Pack 17: New AI functions and more

Posted on: January 31st, 2025 by Frank Wöhrle

With the latest Transit NXT Service Pack, you can benefit from a host of new features to speed up your processes.

New file formats

Transit NXT has been once again expanded to include the very latest file formats with Service Pack 17. Documents from InDesign 2025 and drawings up to AutoCAD 2025 can now also be translated as optional file versions. Documents from the Google Docs Editors Suite are now also officially supported as another file format.

Machine translation

Integration of Amazon Translate.

With the addition of another MT system, translations can now also be requested via Amazon Translate – interactively directly in the translation editor, or as an option automatically when importing the project. New functions are also available for DeepL, Systran and Textshuttle.

Project management

Professional support in the editor thanks to integrated MS Word grammar check, AutoCorrect and AutoComplete functions.

TermStar

For TermStar, we are focusing on terminology export for this Service Pack: For one, TBX version 3 is now officially supported. What’s more, this Service Pack also makes it possible to export multimedia files (e.g. graphics or videos) from dictionaries to certain formats.

New editor functions

Translators can look forward to additional helpful editor functions:

The AutoComplete function makes it quicker to enter words and phrases with project-specific suggestions from dictionaries and translation memories.

AutoCorrect corrects typical typos, typographically converts quotation marks and makes it possible to use shortcuts to enter special characters and frequently used phrases. Date and number formats as well as alphanumeric strings can now be adapted to the target language format with a simple mouse click.

For quality assurance, the translation can now also be checked for correct grammar and corrected interactively. What’s more, Russian for Kazakhstan is now available as an additional working language.

Follow this link to download the Service Pack

You can find more information at: https://www.star-deutschland.net/en/technology-and-software/software-products/

2024 – the year of AI: Important developments and lessons learned

Posted on: December 16th, 2024 by Frank Wöhrle

Another year is drawing to a close, and we can hardly believe how fast the time has flown by. Now is a good opportunity to take a look back at all of the important developments that 2024 – the year of AI – has brought us, and give you an insight into what next year has in store for us.

AI has been a hot topic ever since OpenAI stunned the whole world with ChatGPT. Companies are increasingly insisting on using AI wherever this seems possible. From many discussions and exciting customer projects over the course of the year, we have identified key lessons learned and trends in this field.

Five key trends relating to the use of AI in the context of translation

  • Expectations for generative AI remain very high.
    However, the purposes for which people want to use it differ greatly, especially in language processes: From the fanciful idea of a wonder machine which produces, translates and optimises texts so they are perfect, through to a clever tool that provides targeted assistance with specific tasks that are usually performed manually at present. The increasing integration of large language models into translation processes makes exactly this possible by providing these with targeted and modular support. This ranges from the bilingual extraction of terminology and the post-editing of machine-translated content, through to assessing the quality of multilingual documents.
  • If you want to use the terminology efficiently and sustainably, you also need high-quality, well-structured language resources to be able to supply the language models with relevant information.
    This means that years of working with translation memory and terminology management systems now offers double the benefits. If this data is prepared in a structured and sustainable manner, language models can use it to optimise machine-translated content, for instance in the form of retrieval-augmented generation (RAG).
  • The topic of data protection continues to generate extreme uncertainty despite the adoption of the EU AI Act in May 2024.
    Many companies are looking for ways to use AI in the most secure possible way in order to protect their precious data against misuse.
  • A lot of businesses are experiencing issues with the scalability of AI solutions, whether this concerns the IT infrastructure, financial resources or further training of staff.
  • Human in the cockpit: People will increasingly return to the centre of the AI-based translation workflow.
    While translators were previously responsible for the post-editing of predefined machine-translated content, among other tasks, as part of the human-in-the-loop concept, the new human-in-the-cockpit principle aims for translators to use modern language technologies – even interactively – in order to exert their own influence on the output and to design efficient processes.
    The technological transformation is also resulting in changing requirements for current and future language experts. The relevant universities have also recognised these developments and are revising the degrees and courses they offer accordingly. For instance, prompt engineering, language technologies and information management are important focal topics that will feature more often on the curriculum in future.
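The RAG idea mentioned above can be illustrated in miniature: retrieve the closest validated TM entries for a new sentence and inject them into the prompt. A Python sketch using fuzzy matching from the standard library – the TM entries and prompt wording are illustrative assumptions:

```python
import difflib

def retrieve_tm_matches(source: str, tm: dict[str, str], cutoff: float = 0.6) -> list[tuple[str, str]]:
    """Retrieve the closest translation-memory entries for a new source sentence."""
    matches = difflib.get_close_matches(source, list(tm), n=3, cutoff=cutoff)
    return [(m, tm[m]) for m in matches]

def build_rag_prompt(source: str, tm: dict[str, str]) -> str:
    """Augment a translation request with retrieved TM entries (the RAG principle)."""
    examples = "\n".join(f"{s} => {t}" for s, t in retrieve_tm_matches(source, tm))
    return (f"Translate into German, following the style of these validated examples:\n"
            f"{examples}\nSource: {source}")

tm = {"Open the cover.": "Öffnen Sie die Abdeckung.",
      "Close the cover.": "Schließen Sie die Abdeckung."}
prompt = build_rag_prompt("Open the rear cover.", tm)
```

Production systems use far better retrieval (embeddings, fuzzy match scores from the CAT tool), but the structure of the augmented request is the same.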

Are you interested in this subject? Then don’t miss our STAR webinar, which is scheduled for early 2025. There, we will be sharing information about current trends and our latest technological developments.

AI for voices, voice recordings and voice-over translations

Posted on: October 28th, 2024 by Frank Wöhrle

Can AI help to create high-quality content in any language while adhering to corporate language and specific rules?

Today we’re interviewing David Heider, the owner of a STAR partner sound studio in the Czech Republic, to shed light on this fascinating question – can artificial intelligence be effectively used in the area of video and audio productions?

STAR: David, when did you start offering professional audio productions?

Our recording studio has been providing its services since 1999, and we specialise in the spoken word. We cover two different areas. Firstly, the “corporate world”, with recordings of material for internal purposes, such as e-learning. This also includes the localisation of internal company systems and software. This can be training material or various web-based platforms with voice output, automatic operators on your phone, sat navs, etc. – in short, various applications where we often have to cut the sound word by word or even syllable by syllable, and where everything is then put together by a system into sentences and whole messages.

The second area is more artistic in nature and covers advertising and promotional videos, among other content. It differs from the “corporate world” in that it’s not just about conveying content, but about a form that appeals to listeners and draws them in. So we need professionals who can express themselves artistically and use their voice skilfully. To summarise, you might say that our first area of activity is about providing information – content where users, to put it bluntly, don’t have much choice, as they generally have to listen. Artistic productions, in contrast, aim to win over the “audience” in some way, not only through their content but also through their form.

Sound studio

STAR: This inevitably leads me on to the next question – can AI be used in your work?

AI is an amazing tool and offers numerous advantages. For example, we don’t need to contact a voice-over artist and make an appointment; the AI is always available.

STAR: Are you already using AI?

Yes. We use AI in some cases for preparing and producing audio material. But there’s also a downside. In most languages, the AI voice seems artificial or boring, especially after listening to it for a long time.

STAR: Can’t AI intonate?

Intonation in itself isn’t usually a problem, but the AI produces unnatural inflections, which is really off-putting. It often doesn’t emphasise the core message, which a person would normally convey through a particular stress. And when you listen to an AI recording, you get this unnatural inflection on repeat, which starts to get annoying after a while because you can’t shake the feeling that it’s actually just “copy and paste”. English is a notable exception: there the AI can work with variable intonation and make the voice sound very natural and lively. But in all the other languages, we still have a long way to go before we reach that point. At the moment, they still sound very “plastic”.

STAR: Are there any other disadvantages to AI voices?

There’s a second point that I think is more serious, especially with e-learning. As with any AI, the quality of the output depends on the quality of the input. You also always have to prepare the content correctly for AI voices. Perhaps the AI doesn’t read all the abbreviations correctly, e.g. in the same way as you would read them in a specific corporate culture. Every company has its own corporate jargon and the AI won’t take this into account. This also applies to different product names, place names and foreign words. For example, if French names appear in English text, should it be read in French or English?

STAR: How can this be explained?

Only the employees at a company are really familiar with the corporate language and know why a certain linguistic rule can sometimes be ignored for internal or marketing reasons. And the listeners are insiders, i.e. they usually know what the content is about. Companies also have to be consistent, otherwise it will sound strange to their staff’s ears. Sometimes, of course, a term, abbreviation or name is pronounced in a way outsiders might consider wrong, but that’s just how it’s done at the company and we should respect it.

STAR: What other challenges are there?

Abbreviations and other specific features are a major challenge for AI. They usually need a lot of adjustments and corrections, which can result in the final price being similar to that of a traditional voice-over. We need to create pronunciation tips or edit the text so that it’s easy for the AI to read. This is very time-consuming, so AI makes little sense for a one-off project. In addition, we also “proof-listen” to the AI output, i.e. do a full listen-through to check it.

STAR: Don’t you “proof-listen” for human speakers too?

If there are two of us in addition to the speaker during the recording, we don’t do this any more because we can hear and check everything during the recording. The exceptions are languages that we don’t understand, such as Asian languages. But, in the case of AI, we don’t know beforehand what it knows and what it can read. I’ll give you an example. Let’s take the unit of a “megapascal”. This term has the abbreviation “MPa”, and the AI can read it as “em-pee-ay”, which is complete nonsense to a technical expert. So we’ve got to figure out how to get the AI to read it correctly as “megapascal”.
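David’s megapascal example can be handled with a simple pre-processing step: before the text goes to the AI voice, known problem terms are replaced with a spelling the engine reads correctly. A minimal sketch in Python – the dictionary entries and function name are illustrative assumptions, and a real project would build the hint list together with the customer:

```python
import re

# Hypothetical pronunciation dictionary: maps terms the AI voice would
# misread to a spelling it reads correctly. Entries are illustrative.
PRONUNCIATION_HINTS = {
    "MPa": "megapascal",
    "kWh": "kilowatt hour",
    "e.g.": "for example",
}

def prepare_for_tts(text: str, hints: dict) -> str:
    """Replace known problem terms before sending text to an AI voice."""
    for term, spoken in hints.items():
        # Match the term only as a whole token, so "MPa" inside a longer
        # word is left untouched.
        pattern = r"(?<!\w)" + re.escape(term) + r"(?!\w)"
        text = re.sub(pattern, spoken, text)
    return text
```

The same dictionary can double as the checklist for the subsequent proof-listening pass.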

Sometimes we go through the recording and it seems right to us, but then the customer finds something that doesn’t fit their corporate culture. That’s why I think AI is a useful tool for certain informational texts: it can make the work faster and cheaper, and I’m happy to recommend it. In the hands of an inexperienced user, however, AI can behave unpredictably, and the end product will cause more disappointment than enthusiasm about the resources saved.

STAR: Is there a financial difference?

Yes, using AI reduces the budget to around half or two-thirds, as the work is mainly done by a machine and no voice professionals are involved in the process.

STAR: What do you do if a recording isn’t suitable for AI?

We are the guarantors of quality, and if we have serious and justified doubts about whether AI will lead to the right result, we inform the customer. But customers sometimes want to experience this for themselves. I first try to point this out by saying, “Don’t be disappointed, but I don’t think AI is suitable for this particular project.” Once I feel that I’ve outlined everything, I leave the decision up to them. In some cases, though, customers are themselves unsure and are grateful for our support.

STAR: Thank you, David, for this very interesting discussion about AI in audio recordings.

Photo of David

AI voices aren’t yet perfect, and human voices are still winning the race. They’re able to convey emotions and leave a strong impression. However, AI voices are an inexpensive alternative. Please feel free to contact us for our advice.

David Heider,
owner of a STAR partner sound studio in the Czech Republic

How translations can be processed faster with COTI Level 3

Posted on: August 1st, 2024 by Frank Wöhrle No Comments

In the fast-paced world of the translation and localisation industry, efficiency is the key to success. One solution that can play an important role in delivering this efficiency is the Common Translation Interface (COTI) standard, particularly in its highly developed form – COTI Level 3. But what exactly does this standard entail and how can it speed up translation processes?

What is the COTI standard?

The Common Translation Interface (COTI) standard was developed specifically for the translation and localisation industry to improve interoperability between different software tools and systems. The COTI standard defines a manufacturer-independent format for exchanging data between translation memory systems (TMS) and editorial systems, such as content management systems (CMS) and other tools used in the industry.

Higher COTI level, more automation

COTI levels build on each other and offer varying degrees of integration and automation:

  • Level 1 – core features: Translation data is saved in a defined structure, compressed as a ZIP file with the extension .coti and enhanced with meta information. The data is transferred manually, but the meta information and fixed structure make it easy for the receiving system to interpret the packets.
  • Level 2 – extended features: At this level, the transfer of COTI data packets becomes automated. The editorial system generates a package that is automatically recognised and imported by a TMS as soon as it is placed in a shared transfer folder (hotfolder) that is constantly monitored. Meta information enables the receiving system to create an automated order system, for example.
  • Level 3 – expert features: The highest level of integration offers fully automated data transfer between the systems. This removes the need to create or monitor packages manually. Instead, translation data and meta information is transferred via an API between the editing system and the TMS. Not only translation data, but also status information such as translation progress can be transmitted.
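Based on the description of Level 1 above, such a package is just a ZIP archive with a fixed internal structure, a metadata file and the extension .coti. The following Python sketch illustrates the idea; the internal layout and the metadata fields shown here are assumptions for illustration, not taken from the COTI specification:

```python
import zipfile
from pathlib import Path

def build_coti_package(source_file: Path, target: Path) -> Path:
    """Pack a document plus minimal metadata into a .coti ZIP archive.

    The internal layout (a 'documents' folder plus a metadata XML at the
    root) is an illustrative assumption; consult the COTI specification
    for the exact structure and metadata schema.
    """
    metadata = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        "<COTI>\n"
        '  <project name="demo" sourceLanguage="de" targetLanguage="en"/>\n'
        "</COTI>\n"
    )
    package = target.with_suffix(".coti")
    with zipfile.ZipFile(package, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("COTI.xml", metadata)
        zf.write(source_file, f"documents/{source_file.name}")
    return package
```

At Level 2, dropping the resulting file into the monitored hotfolder is all that is needed to trigger the import on the TMS side.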

 

Diagram of the COTI workflow between customer and language service provider. On the left is "Customer" with the items CMS, PIM and ERP, on the right is "Language service provider" with the items Translation, Terminology and Review. In the centre, a double arrow shows the data transfer from COTI level 1 to 3.

Benefits of full automation with COTI Level 3

The implementation of COTI Level 3 brings with it several benefits that can dramatically improve the translation process:

  • Fast data transfer: Thanks to the fully automated API, translation data is transferred seamlessly between systems without any delay.
  • Increased efficiency: Large and complex translation projects can be processed more efficiently, since the workflow no longer has to include any manual steps.
  • Round-the-clock operation: Automation facilitates continuous operation without human intervention, resulting in round-the-clock availability of translation data.
  • Security: By eliminating manual steps, the risk of human error is minimised, which in turn ensures data transfer is more secure.
  • Time and cost savings: Full automation leads to significant time savings, while also reducing the operational effort and costs involved in translation projects.
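The Level 3 exchange of status information described above can be pictured as a simple polling loop against the TMS API. The sketch below is illustrative only – the endpoint path, the "status" field and the status values are invented, since each COTI Level 3 implementation defines its own API details:

```python
import json
import time

def wait_until_translated(fetch, project_id: str,
                          poll_seconds: int = 60, sleep=time.sleep) -> str:
    """Poll a (hypothetical) TMS status endpoint until translation is done.

    `fetch` is any callable that takes a URL path and returns the raw JSON
    response body, so the HTTP layer (urllib, requests, ...) stays
    pluggable and testable.
    """
    while True:
        body = fetch(f"/projects/{project_id}/status")
        status = json.loads(body)["status"]
        if status == "finished":
            return status
        # Not finished yet: wait before asking the TMS again.
        sleep(poll_seconds)
```

In a production setup, the same API channel would also be used to push the translation data itself, removing the hotfolder entirely.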

Conclusion

The introduction of COTI Level 3 marks a major advance in the translation industry – one which not only increases efficiency, but also improves the quality and reliability of translation processes. Through seamless integration and automated data transfer, companies are able to expand their global reach while also saving time and resources.

The following editorial systems can currently use COTI packages of various levels:

    • TIM – Fischer
    • AEM – Adobe
    • and many more

 

With our translation memory system STAR Transit NXT and our workflow solution STAR CLM, we provide interfaces at all three levels – transferring data efficiently, securely and quickly, and speeding up translation processes.

We process your COTI packages automatically using STAR CLM! 

Contact us for tailored advice