27 Feb 2026

Quick wins for technical writing – how local LLMs can speed up your daily work

Find out why local large language models (LLMs) are considered an insider tip for technical writing and how you can use AI to automate routine tasks.

Has your company yet to discover knowledge-based chatbots? Do you want to ensure that your sensitive data never leaves your own computers? If so, we have a host of practical tips to get you off to a good start.

Why a local LLM makes sense for technical writers

Whether product changes, API updates or new features – technical documentation often needs to be adapted at short notice. This is precisely where local LLM platforms such as Ollama come into play:

  • Data sovereignty: Your content remains in-house.
  • Offline capability: It can be used even without an internet connection.
  • Cost-effectiveness: No ongoing cloud fees.
     

Please note: Be sure to involve your IT security department in the installation and configuration – security first!

A number of suitable local LLM environments are now available that can be used to perform simple tasks in the technical writing office. We took a closer look at the Ollama platform.

 


Ollama at a glance – the local LLM platform

[Screenshot: Ollama software]

 

Who is behind it?

Ollama is an open-source platform for running large language models (LLMs) locally. It was developed by Jeffrey Morgan (CEO) and Michael Chiang, the brains behind Kitematic, now part of Docker.

Over 156,000 GitHub stars (as of 2025) show that the project is growing rapidly. Supported by Y Combinator, Ollama remains community-driven. The platform enables you to run various LLMs locally on your computer (Windows, macOS, Linux). Unlike cloud-based services, your data remains under your control.

Advantages of Ollama

  • No cloud uploads of confidential data
  • Integration of existing reference documents
  • Rapid, repeatable text generation for manuals, API documentation or help files
     

Recommended models

  • Llama3 / Llama3.1 – good balance of size and speed
  • Mistral – ideal for short, precise sections of text
  • Gemma3 – strong at text suggestions and summaries
     

Ollama is easy to install using an installer, and the application can then be launched via its user interface or from the CMD/PowerShell console. When a model is run for the first time, it is downloaded and stored locally.

 


Get your local LLM ready to go in just a few steps

1. Installation

Download Ollama at https://ollama.com/download and follow the installation instructions. Afterwards, the loveable llama will welcome you.

[Screenshot: Ollama software]

 

2. Download a suitable model

After installation, you can download a local model that is suitable for technical documentation or use a model that is already installed.

Open Ollama, click on “Download” and select, for example, llama3.2, gemma3:4b or mistral.

[Screenshot: Ollama software]

Alternatively, you can download additional models via the Windows console (CMD or PowerShell):

To do this, use the command: ollama pull llama3:8b

[Screenshot: Ollama software]

Wait until the model is fully downloaded – and you’re all set.
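Once a model is installed, the local server can also be reached from your own scripts: by default, Ollama listens on http://localhost:11434 and exposes a REST endpoint, /api/generate. The following Python sketch (assuming a running Ollama instance and an installed llama3.1 model) sends a single prompt and prints the reply:

```python
import json
import urllib.request

# Default endpoint of a locally running Ollama server
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for a single, non-streamed generation request."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")


def generate(prompt: str, model: str = "llama3.1") -> str:
    """Send one prompt to the local Ollama server and return the full response text."""
    request = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]


if __name__ == "__main__":
    print(generate("Explain the term 'translation memory' in one sentence."))
```

Because everything runs on localhost, no document content leaves your machine even when scripted.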

 


Hands-on: How to create a new section of text based on a reference document

Starting point

  • A reference manual (e.g. Word or PDF) containing an older version of the documentation is available.
  • New features or modified technology require the document to be revised or reissued.
     

How it works: Update your manual efficiently – with local support

  • Content reuse: Analysis of existing reference documents (PDF, DOCX) with LLM
  • Structuring: Automatic generation of headings, lists and paragraphs
  • Terminology harmonisation: Matching of technical terminology with targeted prompts
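The reference-based workflow above can be sketched as a prompt template. The helper below is a hypothetical example of our own (not part of Ollama) showing how the extracted text of an existing manual might be combined with a clearly scoped update task:

```python
def build_update_prompt(reference_text: str, feature: str) -> str:
    """Combine the existing manual text with a clearly scoped update task."""
    return (
        "You are updating existing technical documentation.\n"
        "Match the style, structure and terminology of the reference below.\n"
        "Do not invent technical specifications.\n\n"
        "Reference document:\n"
        "---\n"
        f"{reference_text}\n"
        "---\n\n"
        f"Task: Write a new section describing the feature '{feature}'."
    )


# Example: prepare a prompt for a new 'Comments at segment level' section
prompt = build_update_prompt("(extracted manual text)", "Comments at segment level")
```

Keeping the reference text and the task in one prompt is what anchors the model to your existing style and terminology.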

 

Practical tips: Prompt engineering for local LLMs made easy

The key to success lies not in the model itself, but in something known as “prompt engineering”. Since we are using a relatively small local model compared to GPT-4, we need to be very precise here. For our example scenario, we would like to document the new “Comments at segment level” function based on an existing manual.

Prompt for creating a chapter

We came up with the following prompt for our enquiry. It is important to refer to the reference document, which you can insert using drag and drop.

[Screenshot: Ollama software]

And the likeable llama replies:

[Screenshot: Ollama software]

 

Example prompts for further useful queries

# Summary of content
“Summarise the contents of this file in 10 bullet points.”

# Creation of a new section
llama3 model: “Create a new section for the ‘Batch Export’ feature in the style of the reference document. Use short, clear sentences and a level 3 heading. Focus on step-by-step instructions.”

# Formatting adjustments
“Convert all step lists in the following text into numbered lists. Keep the technical terms, but simplify the wording slightly.”

Our tip: Always provide the reference document – this ensures that style and structure remain consistent.
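The example prompts above can also be run from a script. The sketch below pipes the prompt and document text into the `ollama run` console command (assuming Ollama is installed and the chosen model has been pulled); the prompt texts mirror the examples above:

```python
import subprocess

# The example prompts from above, keyed by task
PROMPTS = {
    "summary": "Summarise the contents of this file in 10 bullet points.",
    "new_section": (
        "Create a new section for the 'Batch Export' feature in the style of "
        "the reference document. Use short, clear sentences and a level 3 "
        "heading. Focus on step-by-step instructions."
    ),
    "formatting": (
        "Convert all step lists in the following text into numbered lists. "
        "Keep the technical terms, but simplify the wording slightly."
    ),
}


def run_prompt(task: str, document: str, model: str = "llama3") -> str:
    """Feed a task prompt plus document text to a local model via the Ollama CLI."""
    full_prompt = f"{PROMPTS[task]}\n\n{document}"
    result = subprocess.run(
        ["ollama", "run", model],
        input=full_prompt,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout
```

Scripting the prompts this way makes recurring tasks repeatable – the same wording is used for every document.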

 

Saving and post-processing

Even good AI output needs editorial fine-tuning:

  • Check the technical accuracy of the generated content.
  • Adapt the wording to your editorial style.
  • Add screenshots or diagrams if necessary.
  • Test the steps described to ensure they are correct.
  • Try out different models that suit your requirements.

 

Summary: Data protection meets productivity

Local LLMs such as Ollama open up new possibilities for creating technical documentation – without any dependence on the cloud. With precise prompt engineering and targeted post-processing, you can achieve significant time savings without compromising data protection and data sovereignty. Give it a try: you’ll be surprised at how quickly you achieve initial results.

 


More quick wins for technical writing

  • Automatic structuring: Long passages of continuous text are separated into clear headings, lists and paragraphs.
  • Template creation: A reference document is used to create a standardised template (e.g. consistent structure for “Function description”, “Prerequisites”, “Examples”).
  • Linguistic harmonisation: All sections are harmonised to ensure a consistent style and tone (e.g. informal vs formal, passive vs active).

 


What you should bear in mind with local LLMs

Technical limitations

Context window:

  • Problem: Large documents (> 50 pages) → Possible loss of information during processing by the LLM
  • Solution: Chapter-by-chapter processing
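Chapter-by-chapter processing is easy to script once the manual has been exported to plain text. The sketch below (assuming chapters start with a “# ” heading in the exported text – the marker will vary with your export format) splits the document so that each chunk fits into the model's context window:

```python
import re


def split_into_chapters(text: str) -> list[str]:
    """Split an exported manual at top-level headings ('# ...')."""
    parts = re.split(r"(?m)^(?=# )", text)
    return [part for part in parts if part.strip()]


# Each chapter can then be sent to the model as a separate, smaller request
chapters = split_into_chapters("# Installation\n...\n# Configuration\n...")
```

Processing each chapter separately keeps the input well below the context limit and makes it easier to spot where a result needs rework.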
     

Risk of “hallucinations”:

  • Problem: Fictitious technical details in the generated content
  • Solution: Prompt modification with restrictions, e.g. “Only change the explicitly highlighted area,” “Do not invent technical specifications.”

 

Compliance

  • GDPR: On-premises operation keeps your data in-house, which supports compliance
  • Security audit: Have your IT security team conduct risk analysis prior to implementation
     

 


Local LLMs: Limits today – prospects for tomorrow

When local systems reach their limits – for example, when it comes to team-wide collaboration or larger volumes of documentation – the next step is clear: an integrated, cloud-based solution.

Get in touch with us – we’d be happy to show you how to optimally integrate AI into your technical writing processes.
