Archive for the ‘Technology and software’ Category

Quick wins for technical writing – how local LLMs can speed up your daily work

Posted on: February 27th, 2026 by Frank Wöhrle

Find out why local large language models (LLMs) are considered an insider tip for technical writing and how you can use AI to automate routine tasks.

Is your company yet to discover knowledge-based chatbots? Do you want to ensure that your sensitive data does not leave the computer? If so, we have a host of practical tips to get you off to a good start.

Why a local LLM makes sense for technical writers

Whether product changes, API updates or new features – technical documentation often needs to be adapted at short notice. This is precisely where local LLM platforms such as Ollama come into play:

  • Data sovereignty: Your content remains in-house.
  • Offline capability: It can be used even without an internet connection.
  • Cost-effectiveness: No ongoing cloud fees.
     

Please note: Ensure that you collaborate with your IT security department with regard to installation and configuration – safety first!

A number of suitable local LLM environments are now available that can be used to perform simple tasks in the technical writing office. We took a closer look at the Ollama platform.

 


Ollama at a glance – the local LLM platform

Screenshot: Ollama software

 

Who is behind it?

Ollama is an open-source platform for running large language models (LLMs) locally. It was developed by Jeffrey Morgan (CEO) and Michael Chiang, the brains behind Kitematic, now part of Docker.

Over 156,000 GitHub stars (as of 2025) show that the project is growing rapidly. Supported by Y Combinator, Ollama remains community-driven. The platform enables you to run various LLMs locally on your computer (Windows, macOS, Linux). Unlike cloud-based services, your data remains under your control.

Advantages of Ollama

  • No cloud uploads of confidential data
  • Integration of existing reference documents
  • Rapid, repeatable text generation for manuals, API documentation or help files
     

Recommended models

  • Llama3 / Llama3.1 – good balance of size and speed
  • Mistral – ideal for short, precise sections of text
  • Gemma3 – strong at text suggestions and summaries
     

Ollama is easy to install using an installer, and the application can then be launched via a user interface or from the CMD/PowerShell console. When a model is run for the first time, it is downloaded and loaded locally.

 


Get your local LLM ready to go in just a few steps

1. Installation

Download Ollama at https://ollama.com/download and follow the installation instructions. Afterwards, the loveable llama will welcome you.

Screenshot: Ollama software

 

2. Download a suitable model

After installation, you can download a local model that is suitable for technical documentation or use a model that is already installed.

Open Ollama, click on “Download” and select, for example, llama3.2, gemma3:4b or mistral.

Screenshot: Ollama software

Alternatively, you can download additional models via the console in Windows (CMD or PowerShell):

To do this, use the command: ollama pull llama3:8b

Screenshot: Ollama software

Wait until the model is fully downloaded – and you’re all set.
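Beyond the graphical interface, the same pull-and-prompt workflow can also be scripted. The sketch below builds a request body for Ollama's local REST API; the endpoint `http://localhost:11434/api/generate`, the `llama3:8b` model and the sample prompt are assumptions for illustration, and the actual HTTP call is shown commented out because it only works while the Ollama service is running.

```python
import json

# Build a request body for Ollama's local REST API. Assumptions: the default
# endpoint http://localhost:11434/api/generate, an installed llama3:8b model,
# and "stream": False so the full answer arrives as a single JSON object
# instead of a token stream.
def build_generate_request(model: str, prompt: str) -> dict:
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_generate_request(
    "llama3:8b",
    "Summarise this changelog in 3 bullet points.",
)
body = json.dumps(payload).encode("utf-8")

# Sending the request only works while the Ollama service is running locally:
# from urllib import request
# req = request.Request("http://localhost:11434/api/generate", data=body,
#                       headers={"Content-Type": "application/json"})
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])

print(body.decode("utf-8"))
```

The same payload shape works for any model you have pulled – only the `model` field changes.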

 


Hands-on: How to create a new section of text based on a reference document

Starting point

  • Reference manual (e.g. Word or PDF) with older version of the documentation is available.
  • New features or modified technology that require the document to be revised or reissued.
     

How it works: Update your manual efficiently – with local support

  • Content reuse: Analysis of existing reference documents (PDF, DOCX) with LLM
  • Structuring: Automatic generation of headings, lists and paragraphs
  • Terminology harmonisation: Matching of technical terminology with targeted prompts

 

Practical tips: Prompt engineering for local LLMs made easy

The key to success lies not in the model itself, but in something known as “prompt engineering”. Since we are using a relatively small local model compared to GPT-4, we need to be very precise here. For our example scenario, we would like to create a new section documenting the “Comments at segment level” function based on an existing manual.

Prompt for creating a chapter

We came up with the following prompt for our enquiry. It is important to refer to the reference document, which you can insert using drag and drop.

Screenshot: Ollama software

And the likeable llama replies:

Screenshot: Ollama software

 

Example prompts for further useful queries

# Summary of content
“Summarise the contents of this file in 10 bullet points.”

# Creation of a new section
llama3 model: “Create a new section for the ‘Batch Export’ feature in the style of the reference document. Use short, clear sentences and a level 3 heading. Focus on step-by-step instructions.”

# Formatting adjustments
“Convert all step lists in the following text into numbered lists. Keep the technical terms, but simplify the wording slightly.”

Our tip: Always provide the reference document – this ensures that style and structure remain consistent.

 

Saving and post-processing

Even good AI needs editorial fine tuning:

  • Check the technical accuracy of the generated content.
  • Adapt the wording to your editorial style.
  • Add screenshots or diagrams if necessary.
  • Test the steps described to ensure they are correct.
  • Try out different models that suit your requirements.

 

Summary: Data protection meets productivity

Local LLM platforms such as Ollama open up new possibilities for creating technical documentation – without any dependence on the cloud. With precise prompt engineering and targeted post-processing, you can achieve significant time savings without compromising data protection and data sovereignty. Give it a try: you’ll be surprised at how quickly you achieve initial results.

 


More quick wins for technical writing

  • Automatic structuring: Long passages of continuous text are separated into clear headings, lists and paragraphs.
  • Template creation: A reference document is used to create a standardised template (e.g. consistent structure for “Function description”, “Prerequisites”, “Examples”).
  • Linguistic harmonisation: All sections are harmonised to ensure a consistent style and tone (e.g. informal vs formal, passive vs active).

 


What you should bear in mind with local LLMs

Technical limitations

Context window:

  • Problem: Large documents (> 50 pages) → Possible loss of information during processing by the LLM
  • Solution: Chapter-by-chapter processing
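Chapter-by-chapter processing can be automated with a simple splitter. This is a minimal sketch that assumes numbered headings (“1”, “2.1”, …) at the start of a line; adapt the pattern to your manual’s actual structure.

```python
import re

# Split a manual into chapter-sized chunks at its numbered headings, so each
# chunk can be sent to the LLM separately and stay within the context window.
def split_into_chapters(text: str) -> list[str]:
    chapters = []
    current = []
    for line in text.splitlines():
        # Start a new chunk whenever a numbered heading begins a line.
        if re.match(r"^\d+(\.\d+)*\s+\S", line) and current:
            chapters.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        chapters.append("\n".join(current))
    return chapters

manual = "1 Introduction\nSome text.\n2 Installation\nMore text.\n2.1 Requirements\nDetails."
for chunk in split_into_chapters(manual):
    print(repr(chunk))
```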
     

Risk of “hallucinations”:

  • Problem: Fictitious technical details in the generated content
  • Solution: Prompt modification with restrictions, e.g. “Only change the explicitly highlighted area,” “Do not invent technical specifications.”

 

Compliance

  • GDPR: On-premises operation keeps data in-house, making compliance far easier to ensure
  • Security audit: Have your IT security team conduct risk analysis prior to implementation
     

 


Local LLMs: Limits today – prospects for tomorrow

When local systems reach their limits – for example, when it comes to team-wide collaboration or larger volumes of documentation – the next step is clear: an integrated, cloud-based solution.

Get in touch with us – we’d be happy to show you how to optimally integrate AI into your technical writing processes.

Making e-learning effective worldwide: Achieve real learning success with professional localisation

Posted on: November 27th, 2025 by Frank Wöhrle

E-learning is considered a central pillar of continuing professional development in many companies – from global onboarding courses to complex product training. At the same time, projects repeatedly face similar challenges: Content that works extremely well in the country of origin loses its impact in other markets, is misunderstood or simply not used. The reason for this rarely lies in the didactic concept itself, but rather in the type and quality of localisation.

Why companies rely on e-learning

From a business perspective, numerous factors speak in favour of digital learning formats. Employees can learn flexibly – regardless of location, time zone and device, which suits geographically dispersed teams.

E-learning supports independent learning at any time: Content is available on demand, without having to rely on specific training dates. Thanks to their modular setup, learning units can be clearly structured, specifically combined and, if necessary, updated gradually.

Another advantage is that the learning pace is down to each individual: Employees can pause, repeat or delve deeper into complex content without disrupting anyone else’s flow. In addition, digital training makes it possible to create customised learning content – tailored to specific roles, regions or target groups within the company.

Multimedia elements such as videos, animations, interactive exercises and quizzes create a rich learning experience and increase engagement. Offering content in multiple languages contributes significantly to accessibility and, for international workforces, makes real headway in terms of removing barriers to learning.

Ultimately, when these factors are successfully implemented, they lead to increased learning success – measurable in terms of knowledge transfer, application in daily work and reduced error rates.

Complexity of modern e-learning formats

In practice, it quickly becomes apparent that e-learning courses are significantly more complex than traditional training materials. A typical module includes slides or screen recordings, embedded videos, spoken commentary, subtitles and interactive elements such as quizzes, conversation simulations and the like.

When it comes to localisation, this means that the content to be translated is not contained in a single file or format, but is distributed across a variety of authoring tools (such as Adobe Captivate, Articulate Storyline, Articulate Rise, iSpring, Elucidat or Lectora), SCORM packages, video and audio scripts, and, where applicable, external sources such as course descriptions or content in files linked to the course. Video and audio scripts may also need to be transcribed before localisation. In addition, there are technical requirements – such as support for character sets, space restrictions in buttons, and synchronising subtitles and voiceovers.

Underestimating the complexity of this process often leads to problems during the project: Missing text exports, texts with similar content in different formats, untranslated UI elements or videos that have to be edited retrospectively at great expense. For a localisation process to run smoothly, therefore, a structured approach that takes all components into account from the outset is crucial.

Learning in your native language: an efficiency factor

From a didactic perspective, it is well documented that learning content is best internalised when delivered in one’s own mother tongue. Learners then need to expend less cognitive effort in understanding the language and can concentrate more on content, context and application.

This is particularly important when dealing with complex, security-related or legal issues in order to avoid misunderstandings and misinterpretations. Emotional access also plays a role: Language can influence how credible, esteemed and motivating a training course is perceived to be.

For companies, this means that even employees with good foreign language skills benefit from training in their native language – namely, by making faster and more stable progress in their learning. Those who systematically utilise these positive consequences are able to significantly increase the effectiveness of global learning programmes and, at the same time, justify the investment in localisation.

 

An editor sits smiling at a desk with a headset in a modern office.

The role of professional specialist translations in e-learning localisation

In order for e-learning courses in other languages to achieve the same learning objectives as the original, a basic word-for-word translation simply cannot do the job.
Native-speaking specialist translators combine linguistic competence with industry knowledge and are familiar with the terminology and common phrases used in their respective fields of expertise.

They ensure that technical terms are used consistently, instructions are clear and action-oriented, and didactic subtleties are preserved.
At the same time, they adapt examples, metaphors or references if these are not readily translatable with regard to either culture or context.

Professional translation therefore makes a significant contribution to learning objectives being achieved quickly: Content is easier to understand, easier to remember and more likely to be put into practice.
A clearly defined terminology and review process also supports company-wide consistency, both in terms of corporate documentation and individual learning outcomes. This is especially the case when dealing with a vast number of courses and a multitude of languages.

The importance of professionally localised audio and video

Audio and video are key carriers of information and sources of motivation in modern e-learning courses – and pose particular challenges for localisation.
Voiceover texts must be translated in such a way that they match the visual material in terms of tone, length and rhythm, while at the same time being technically accurate.

For voiceovers, you also need to select suitable narrators or satisfactory AI software to suit your corporate image and target audience.
In addition to the voice, elements such as the narrator’s gender, age, pronunciation quality, accent and/or dialect, and any background sound such as music, etc., are crucial in order to avoid misunderstandings and convey an entirely professional feel.

The client’s specifications regarding desired pronunciation, use of abbreviations, accessible (i.e. gender-neutral) language, and so on, are essential, as the client’s satisfaction will be strongly tied to how well these requests are implemented.

Subtitles, on the other hand, must be precise, easy to read and synchronised with the spoken word. The wording must be concise as well as complete, with some rephrasing required.
Last but not least, visual elements – such as text overlays or UI screens – may need to be adapted to ensure that they remain understandable in the target language and fit in with the design in terms of form.

Licence for e-learning

Last but not least, when using human voices for voiceovers, the customer must also clarify the intended use and reach. Are the e-learning courses with voiceover to be used exclusively internally or also publicly? Are there any plans to sell the courses commercially to third parties?
Depending on the type of distribution and the type of media/rights usage, the recording studio involved in the project may charge a licence fee per voice artist used. In most cases, these are flat fees with indefinite validity.

Summary: Localisation as an integral part of the e-learning strategy

E-learning can only reach its full potential when content is tailored to the specific language and culture.
Companies wishing to roll out e-learning courses internationally would do well to consider localisation as an integral part of the conceptualisation process from the outset – rather than as a downstream translation step.

The combination of didactically excellent courses, native-language specialist translation and professionally localised audio and video elements forms the basis for true learning success in multiple languages.
This makes global training programmes consistent, efficient and effective – and fulfils the requirement to make knowledge accessible across the globe without ever compromising on quality.

Get in touch to find out how we can help in ensuring your learning content achieves exactly the international impact you want – we speak your language!

Terminology in the translation process: Why it’s more important than ever in the age of AI

Posted on: October 30th, 2025 by Frank Wöhrle

Terminology work has always been the key to consistent, high-quality translations. The advent of AI and large language models (LLMs) is fundamentally changing translation processes. But this is precisely what makes terminology even more important.

“AI now translates everything perfectly – so why do we still need terminology management?”
As a language service provider, we are increasingly hearing this question being posed. But anyone who has ever seen how a single mistranslated technical term can distort a product description, manual or marketing message knows that terminology is not a side issue – it is the foundation of high-quality translation.

In the age of LLMs and generative AI, the how of handling terminology is changing. The why, however, remains unchanged.

Why terminology is the backbone of every translation

Terminology is much more than just a dictionary. It defines how a company talks about its products, services and values.

Whether we’re talking about “controllers”, “control modules” or “control units”, the right term ensures recognition, trust and legal certainty.

Without consistent terminology, inconsistencies are bound to arise. In practice, this leads to:

  • different translations for the same term,
  • increased correction effort,
  • unnecessary additional costs, multiplied by the number of target languages,
  • inconsistent brand communication,
  • misunderstandings among customers or users.


Consistent terminology management is crucial for ensuring consistency and precision in the translation process, especially when dealing with high-volume, multilingual content from the fields of technical documentation or marketing.

From glossaries to integrated solutions: Successful translation processes thanks to terminology

In the past, terminology work was often handled outside of the actual translation process – in the form of Excel spreadsheets or static lists. Today, it can be seamlessly integrated into translation management systems (TMS).
This enables:

  • automatic terminology suggestions directly in the CAT tool,
  • terminology checks during translation,
  • centralised maintenance and approval processes.


This makes terminology a living part of the workflow, not just an afterthought during quality control.
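A rule-based terminology check of this kind can be sketched in a few lines. The term base and segment pairs below are invented sample data for illustration, not output from a real TMS:

```python
# Flag target segments that miss the approved target term for a source term
# they contain. Real CAT tools do this with stemming and fuzzy matching; this
# sketch uses plain substring checks to show the principle.
term_base = {"control unit": "Steuergerät"}  # approved EN -> DE term pairs

segments = [
    ("Connect the control unit.", "Schließen Sie das Steuergerät an."),
    ("Reset the control unit.", "Setzen Sie das Steuermodul zurück."),
]

def check_terminology(segments, term_base):
    issues = []
    for source, target in segments:
        for src_term, tgt_term in term_base.items():
            # Source uses the term, but the approved translation is missing.
            if src_term in source.lower() and tgt_term not in target:
                issues.append((source, f"expected '{tgt_term}'"))
    return issues

for issue in check_terminology(segments, term_base):
    print(issue)
```

Here the second segment is flagged because “Steuermodul” was used instead of the approved “Steuergerät” – exactly the kind of inconsistency an integrated terminology check catches during translation rather than afterwards.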

How AI and LLMs are changing terminology work

AI systems and LLMs open up new possibilities for maintaining terminology in a more dynamic and intelligent way. Some specific applications may include:

  • AI terminology extraction:
    AI can quickly analyse multilingual texts to automatically recognise relevant technical terms and suggest them as term candidates. This saves time during the creation phase and helps to identify terminology that has not been taken into account previously. However, final validation remains the task of human experts.
  • Building a terminology database:
    If translations or a defined structure have yet to be established, generative AI can support the creation of a terminology database. This allows variants and synonyms to be clustered efficiently, while metadata such as context, grammatical information or suggested definitions are generated automatically. However, the final review and validation stages are still handled by humans.
  • Terminology checks by AI:
    Terminology errors identified by a rule-based check are sent to the AI, where they are evaluated and corrected in their overall context, taking into account additional terminological information.


These new approaches make terminology work faster, more scalable and more data-driven. At the same time, it remains dependent on human validation – because AI does not automatically understand corporate language or brand values.

Limitations and risks: When AI ‘invents’ terms

As powerful as LLMs are, they also pose a real risk. This is because an AI model can:

  • ‘hallucinate’ terms – i.e. create plausible but incorrect terms,
  • overlook customer-specific requirements if these are not clearly specified in the prompt or system,
  • put confidential terminology data at risk if it is fed into publicly accessible systems.


The conclusion? AI can support, but not decide. Human expertise remains indispensable when deciding whether a term is terminologically correct, brand-compliant and contextually appropriate.

Best practices: How we combine human expertise with the power of AI

As a language service provider, we see the added value in using technology sensibly – not automating everything blindly. Successful terminology work in the age of AI is based on four principles:

  • Centralisation:
    All terminology data belongs in a central database – not in miscellaneous lists scattered far and wide.
  • Integration:
    Terminology must be directly linked to CAT tools so that translators can access it in real time.
  • AI as a support, not a substitute:
    AI tools can assist with research, extraction and checks – but final validation remains in human hands.
  • Security-conscious:
    Sensitive terminology data should only be processed in data protection-compliant, controlled systems.

 

Terminology remains strategic corporate knowledge

Artificial intelligence and large language models are fundamentally changing how we work with language – but they are no substitute for terminology management. When used correctly, they actually make it more efficient and intelligent. Terminology is a company’s linguistic memory.
Particularly in the age of generative AI, clear and well-defined terms are crucial to ensure that man and machine truly speak the same language.

Contact us if you want to build your terminology efficiently, maintain it consistently and optimise it with AI support – we will support you every step of the way.

Learn more about our services in combination with AI for efficient terminology management

Transit NXT Service Pack 18: Smart functions for translation workflows – now available!

Posted on: September 9th, 2025 by Frank Wöhrle

With Transit NXT Service Pack 18, the STAR Group is introducing a powerful extension to its translation memory system – a real game changer for professional translators and project managers!

What’s new? – Highlights at a glance

  • Various improvements: The update enhances existing functionalities to make translation even more efficient. A variety of user requests have been implemented.
  • Translation of variables in InDesign documents: A particularly exciting addition is that variables in InDesign documents can now be translated directly. This gives translators additional flexibility when dealing with complex layout files.

Why is it worth upgrading to Service Pack 18?

  • Machine translation: DeepL Pro now also supports language variants in glossaries (e.g. for English, Portuguese and Chinese). For Textshuttle, you can now control whether terminology from project dictionaries is transferred to Textshuttle or not.
  • Project exchange: Transit NXT now supports Phrase projects. Users can unpack MXLIFF files directly, translate them and import them back into Phrase.
  • Optimised web search: In the integrated web search, the prioritisation of services has been optimised in order to obtain initial results even faster.

Which stakeholders benefit most from SP 18?

  • Professional translators who frequently work with DTP tools such as InDesign and want to edit the content of variables efficiently.
  • Project managers who want to equip their teams with an even more powerful CAT environment.
  • Companies that strive for high quality, speed and flexibility in localisation.

Follow this link to download the Service Pack

You can find more information at: https://www.star-deutschland.net/en/technology-and-software/software-products/

Transit NXT: The underestimated CAT tool that has the professionals convinced

Posted on: July 2nd, 2025 by Frank Wöhrle

Anyone who regularly works with CAT tools (computer-aided translation software) probably thinks of Trados Studio, memoQ or Across first. One name is often overlooked – and unfairly so: Transit NXT: the underestimated CAT tool from the STAR Group. It is a genuine powerhouse for anyone who wants their work to be structured, consistent and terminology-focused.

What actually is Transit NXT?

Transit is a professional CAT tool that has been on the market since the 1990s. It combines classic segmentation with a project-orientated working method – incorporating translation memory, terminology management, preview options, quality checks and various functions designed specifically for technical documentation.
The extensive and growing portfolio of AI features, which is demonstrated in a series of short videos on our YouTube channel, is not to be missed.

5 reasons why so many professionals have put their trust in Transit NXT for years

1. Up-to-date and contextualised terminology

Transit works seamlessly with TermStar. Live terminology entries are displayed to translators within the editor itself – including the definition, context and source of the term. This extensive integration is a clear benefit over those tools where the terminology often features only in the sidelines.

2. Project structure, not file chaos

Unlike other CAT tools, Transit thinks in terms of projects with a clear-cut file structure. This takes the hard work out of managing big or lengthy translation projects – especially when it comes to regular updates or complex workflows.

3. Need technical formats? No problem with Transit.

Whether DITA, XML, FrameMaker, InDesign or XLIFF – Transit leads the way when it comes to the variety of natively supported file formats. Many other tools need extra modules or conversions to handle these files.

4. Local installation – full data sovereignty

Transit NXT works entirely locally – without any cloud obligations. For companies that have high data protection requirements, this is a crucial advantage over cloud-based solutions.

5. Quality assurance at the highest level

With automated checks, an in-context preview function and variant check, Transit NXT offers precise quality management for an impressive level of efficiency that is especially beneficial for those handling technical content.

 

Transit software user interface

Who is Transit most suited to?

  • Technical translators working with complex formats.
  • Public authorities, industrial companies and service providers who need to keep sensitive data locally.
  • Freelancers who attach great importance to a reliably maintained terminology.
  • Translation agencies that want an efficient tool for managing large structured projects.

Sound good?

Transit NXT is no entry-level tool – but that is precisely what makes it a great option for anyone who values structure, terminology and format variety.

If you want to see for yourself how Transit works, simply request a non-binding trial version now.

STAR is a certified SCHEMA ST4 translation service provider

Posted on: May 30th, 2025 by Frank Wöhrle

We have successfully completed the training for the SCHEMA ST4 content management system: STAR Deutschland is now an officially certified translation service provider for SCHEMA ST4.

What is SCHEMA ST4?

SCHEMA ST4 is a professional content management system that more and more companies are turning to when producing technical documentation. It assists users in the creation, management and publication of multilingual product documentation (manuals, instructions, catalogues, online guides, etc.).

SCHEMA ST4 is an XML-based editing system that separates the layout from the textual content. In technical documentation, this is very beneficial when reusing text fragments and when managing multiple languages and versions.

SCHEMA ST4 finds application in a broad spectrum of industries, e.g. in the automotive sector, in mechanical and plant engineering or in pharmaceuticals. One major benefit of this system lies in the extensive optimisation of the translation process, which in turn reduces costs.

Training content and key training topics

The “Translation Management” training programme covers the various steps of the translation process, namely:

  1. Selecting the right text fragments
  2. Exporting the text content for translation, if necessary using COTI
  3. The subsequent import of the translated content into SCHEMA ST4

The training also offers insights into potential challenges that may be encountered, both in terms of the editing and the translation.

Translation process for SCHEMA ST4 content

The SCHEMA ST4 content management system is one of the most frequently implemented solutions in technical editing among STAR’s customers.

Let us assist you with our in-depth knowledge of the SCHEMA ST4 translation interface and the related processes.

Get in touch now with no obligation!

Translate more efficiently with AI – but how?

Posted on: February 26th, 2025 by Frank Wöhrle

AI: What started as a buzzword, and then became an established term in everyday language, is now a basic requirement for many applications and processes. And this technology is not stopping at the language industry either. Since the launch of ChatGPT, we know that translating can now also be completely interactive. Large language models, also known as LLMs, in chatbot form are now flooding the market. It feels as though a new model is popping up every week, announcing its intention to outdo its competitors in terms of efficiency, quality and reliability. Neural machine translation (NMT) doesn’t seem that old – and yet we are already discussing when this technology will disappear from the market and be replaced by generative AI.

The key question is: I want to translate more efficiently with AI – but how?

AI for targeted optimisation of translation quality

Even though the technology has made significant progress over the last five years, the results of the commonly used and established NMT systems are not always good enough. This can have a variety of causes:

  • The desired language combination has not been trained with sufficient material or goes via a pivot language (often English). This can lead to structural problems or errors in meaning.
  • The MT system does not recognise specialist or customer-specific terminology.
  • The MT system was used for content in which style is extremely important or the translation needs to be targeted towards a specific target group.


Manuals, marketing texts or content with high customer visibility therefore often do not achieve the desired levels of quality through machine translation alone. Language professionals then optimise the machine-generated texts as part of a post-editing process. Machine translations are carefully checked, compared with the source text and corrected if necessary.

As a central translation platform, the CAT tool enables users to work efficiently and offers targeted support for quality assurance thanks to a range of automated features. But where exactly is AI being used here? LLMs such as ChatGPT from OpenAI are perfectly capable of producing translations that, like the output of DeepL or Google Translate, provide a good starting point for further processing, depending on how they are to be used.

However, a significant leap in quality can be achieved by improving the translation requests through the targeted use of prompts and the addition of reference files. To achieve this, however, in addition to a well thought-out prompt engineering design, validated translation resources in the form of translation memory and terminology databases are a fundamental prerequisite.

 

AI for better translation resources

As with any new technology, a question often arises: What can AI do for me?

However, if you want to integrate AI into your language processes in the long term, you should first ask yourself: What can I do for AI?

Well-maintained translation resources make a significant contribution to improving the results of your AI solution. Take the topic of terminology, for example. If you use a generic system such as DeepL for your translation processes, you will receive translations that do not match your company terminology – unless you integrate a glossary.

Are you only at the stage of establishing your terminology but don’t want to miss out on the benefits of MT? Use language models to extract potential terminology from your monolingual or multilingual documents. You can also use AI to check your translation memory databases, for example to find inconsistent translations or to automate clean-up or correction across large data sets. Use these resources consistently to increase the translation quality of your language model or improve the output of NMT systems.
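The translation-memory consistency check mentioned above can be sketched in a few lines of Python. The function name and sample segments are hypothetical; real TM data would come from a TMX export or a CAT tool:

```python
from collections import defaultdict

def find_inconsistent_translations(tm_pairs):
    """Group TM segments by source text and report every source
    that has more than one distinct target translation."""
    targets = defaultdict(set)
    for source, target in tm_pairs:
        targets[source.strip()].add(target.strip())
    return {src: sorted(tgts) for src, tgts in targets.items() if len(tgts) > 1}

# Hypothetical TM extract with one inconsistency
tm = [
    ("Drücken Sie die Taste.", "Press the button."),
    ("Drücken Sie die Taste.", "Press the key."),
    ("Schalten Sie das Gerät ein.", "Switch on the device."),
]
issues = find_inconsistent_translations(tm)
# issues -> {"Drücken Sie die Taste.": ["Press the button.", "Press the key."]}
```

Flagged segments can then be reviewed by a linguist or passed to a language model for a suggested harmonisation.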

AI as co-pilot? Reach your destination safely with the new STAR webinar series

As you can see, we are extremely enthusiastic about the topic of AI – and we don’t claim to be reinventing the wheel. However, the technology offers a lot of potential for optimisation if it is used efficiently and sustainably.

Of course, we would like to share our enthusiasm with you and invite you to our webinar series “AI as co-pilot: Forging new paths to smart language processes” starting in March. All the webinars will be held in German only.


How exactly does generative AI actually work? What advantages does it offer for translation? How can I use language models to create product texts? Can I train my own AI? And what actually happens to my data?

Our Language Technology Consultant Julian Hamm answers these and many other questions and discusses the many different uses of generative AI, including translation, terminology management, content creation and content delivery. You can expect the following content in the first group of topics:

  • Was ist generative KI, und wofür kann ich sie einsetzen? (What is generative AI and what can I use it for?)
  • Wie kann KI bei der Übersetzung unterstützen? (How can AI support the translation process?)
  • Wie kann ich KI für die Terminologiearbeit einsetzen? (How can I use AI for terminology work?)
  • Welche Vorteile bietet KI für die Content-Erstellung? (What advantages does AI offer for content creation?)


Further information on the events and the registration form can be found here.

We look forward to you joining us!

Transit NXT Service Pack 17: New AI functions and more

Posted on: January 31st, 2025 by Frank Wöhrle No Comments

With the latest Transit NXT Service Pack, you can benefit from a host of new features to speed up your processes.

New file formats

With Service Pack 17, Transit NXT has once again been expanded to include the very latest file formats. Documents from InDesign 2025 and drawings up to AutoCAD 2025 can now also be translated as optional file versions. Documents from the Google Docs Editors suite are now also officially supported as another file format.

Machine translation

With the integration of Amazon Translate, another MT system is now available: translations can be requested via Amazon Translate – interactively, directly in the translation editor, or optionally automatically when importing the project. New functions are also available for DeepL, Systran and Textshuttle.

Project management

Professional support in the editor thanks to integrated MS Word grammar check, AutoCorrect and AutoComplete functions.

TermStar

For TermStar, this Service Pack focuses on terminology export: for one thing, TBX version 3 is now officially supported; for another, multimedia files (e.g. graphics or videos) can now be exported from dictionaries to certain formats.

New editor functions

Translators can look forward to additional helpful editor functions:

The AutoComplete function makes it quicker to enter words and phrases with project-specific suggestions from dictionaries and translation memories.

AutoCorrect corrects typical typos, typographically converts quotation marks and makes it possible to use shortcuts to enter special characters and frequently used phrases. Date and number formats as well as alphanumeric strings can now be adapted to the target language format with a simple mouse click.

For quality assurance, the translation can now also be checked for correct grammar and corrected interactively. What’s more, Russian for Kazakhstan is now available as an additional working language.

Follow this link to download the Service Pack

You can find more information at: https://www.star-deutschland.net/en/technology-and-software/software-products/

2024 – the year of AI: Important developments and lessons learned

Posted on: December 16th, 2024 by Frank Wöhrle No Comments

Another year is drawing to a close, and we can hardly believe how fast the time has flown by. Now is a good opportunity to take a look back at all of the important developments that 2024 – the year of AI – has brought us, and give you an insight into what next year has in store for us.

AI has been a hot topic ever since OpenAI stunned the whole world with ChatGPT. Companies are increasingly insisting on using AI wherever this seems possible. From many discussions and exciting customer projects over the course of the year, we have identified key lessons learned and trends in this field.

Five key trends relating to the use of AI in the context of translation

  • Expectations for generative AI remain very high.
    However, the purposes for which people want to use it differ greatly, especially in language processes: they range from the fanciful idea of a wonder machine that produces, translates and optimises texts to perfection, through to a clever tool that provides targeted assistance with specific tasks that are currently still performed manually. The increasing integration of large language models into translation processes makes exactly this possible by supporting them in a targeted, modular way – from the bilingual extraction of terminology and the post-editing of machine-translated content through to assessing the quality of multilingual documents.
  • If you want to use the technology efficiently and sustainably, you also need high-quality, well-structured language resources so that you can supply the language models with relevant information.
    This means that years of working with translation memory and terminology management systems now offers double the benefits. If this data is prepared in a structured and sustainable manner, language models can use it to optimise machine-translated content, for instance in the form of retrieval-augmented generation (RAG).
  • The topic of data protection continues to generate extreme uncertainty despite the adoption of the EU AI Act in May 2024.
    Many companies are looking for ways to use AI in the most secure possible way in order to protect their precious data against misuse.
  • A lot of businesses are experiencing issues with the scalability of AI solutions, whether this concerns the IT infrastructure, financial resources or further training of staff.
  • Human in the cockpit. People will increasingly return to the centre of the AI-based translation workflow.
    While translators were previously responsible, among other tasks, for the post-editing of predefined machine-translated content as part of the human-in-the-loop concept, the new human-in-the-cockpit principle sees translators using modern language technologies – even interactively – to exert their own influence on the output and to design efficient processes.
    The technological transformation is also resulting in changing requirements for current and future language experts. The relevant universities have also recognised these developments and are revising the degrees and courses they offer accordingly. For instance, prompt engineering, language technologies and information management are important focal topics that will feature more often on the curriculum in future.
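The retrieval step of a RAG setup like the one mentioned above can be illustrated with a deliberately simple Python sketch that scores reference entries by word overlap. Production systems would use embeddings and fuzzy matching instead; all names and data here are hypothetical:

```python
def retrieve_reference_entries(segment, resources, top_k=2):
    """Score each reference entry (a TM segment or term entry) by word
    overlap with the segment to translate, and return the best matches –
    the retrieval step of a minimal RAG pipeline."""
    seg_words = set(segment.lower().split())
    scored = []
    for source, target in resources:
        overlap = len(seg_words & set(source.lower().split()))
        if overlap:
            scored.append((overlap, source, target))
    scored.sort(reverse=True)  # highest overlap first
    return [(src, tgt) for _, src, tgt in scored[:top_k]]

# Hypothetical bilingual resources
resources = [
    ("Die Pumpe fördert das Kühlmittel.", "The pump delivers the coolant."),
    ("Der Motor treibt die Pumpe an.", "The motor drives the pump."),
    ("Bitte lesen Sie das Handbuch.", "Please read the manual."),
]
hits = retrieve_reference_entries("Die Pumpe und der Motor", resources)
```

The retrieved entries would then be injected into the prompt, so the language model optimises its output against the company’s own validated data rather than generic training material.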

Are you interested in this subject? Then don’t miss our STAR webinar, which is scheduled for early 2025. There, we will be sharing information about current trends and our latest technological developments.

Certified processes: The foundations for AI integration you can trust

Posted on: November 25th, 2024 by Frank Wöhrle No Comments

This year, STAR Deutschland GmbH once again welcomed its independent certification partner LinquaCert to its Sindelfingen office for the ISO 18587:2017 surveillance audit (“Post-editing of machine translation output”) shortly before the annual tekom conference. We are pleased to confirm the successful recertification in line with this standard that relates explicitly to quality assurance in the production of machine translations.

Spotlight on terminology integration and automation in quality assurance when incorporating AI into translation processes

As well as active discussions on qualifications, training measures and quality measures, there was once again a real need to discuss the integration of generative AI into translation processes.
The spotlight fell primarily on terminology integration and automation in quality assurance, which provide more tailored support to the linguists delivering MT post-editing projects and are designed to reduce the processing effort.
As a longstanding technology partner and language service provider, we embrace current trends and give our translators the expertise they need to be able to work efficiently and in a future-oriented way.

Missed tekom, but want to know more about AI? Our STAR webinar exploring “Augmented Translation” offers a quick insight into the latest developments in language technologies. Register here to be sent the webinar recording: https://www.star-deutschland.net/en/language-management-consulting/machine-translation-and-post-editing/star-webinar-augmented-translation/ 

Want to get the most out of modern language technologies and are committed to delivering high-quality translations? We have the right service for you: https://www.star-deutschland.net/en/language-management-consulting/machine-translation-and-post-editing/