llms.txt


llms.txt is a proposed standard for a text file in the root directory of a website that helps large language models (LLMs) understand the site's content structure, identify authoritative pages, and use its information correctly. It is not a confirmed universal requirement for AI Search visibility.

What llms.txt is

llms.txt is a plain-text file placed at the root of a website — for example, yourdomain.com/llms.txt. It is designed to give large language models structured guidance about what the site contains, which pages are most authoritative, and how the information should be interpreted.

The specification was proposed by Jeremy Howard and is maintained at llmstxt.org. Its authors describe it as a proposal to standardise a file that helps LLMs use website information at inference time.

llms.txt vs robots.txt — key difference

robots.txt tells crawlers which pages they can or cannot access.

llms.txt tells language models which content is most important and how to interpret the site — it is a guidance file, not an access control file.

Both files can coexist and serve different purposes.
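The difference is easiest to see side by side. Both snippets below are illustrative sketches, not files from a real site. A robots.txt rule controls access:

```text
User-agent: *
Disallow: /admin/
```

An llms.txt entry, written in markdown, offers guidance instead; the section name, URL, and page description here are hypothetical:

```markdown
## Documentation

- [Getting started](https://example.com/docs): the canonical setup guide
```

A crawler that obeys the first snippet will skip /admin/ entirely; a model reading the second is merely told which page to treat as authoritative.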

What a basic llms.txt contains

A minimal llms.txt file typically includes a site name, a short description, and links to the most authoritative pages. In the format proposed by the specification, the site name is an H1 heading, the description a blockquote, and each group of links sits under an H2 heading:

# Grupa Insight

> Full-stack digital agency specializing in AI Search Optimization, headless architecture, and performance SEO.

## Key pages
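Filled out with a link list under the section heading, a complete minimal file might look like the sketch below. The URLs and page descriptions are illustrative placeholders, not actual Grupa Insight pages:

```markdown
# Grupa Insight

> Full-stack digital agency specializing in AI Search Optimization, headless architecture, and performance SEO.

## Key pages

- [Services](https://example.com/services): overview of AI Search Optimization and performance SEO offerings
- [Case studies](https://example.com/case-studies): documented project outcomes
- [Glossary](https://example.com/glossary): definitions of AI search and SEO terms
```

Per the llmstxt.org proposal, the H1 site name is the only required element; the blockquote summary and the H2-delimited link-list sections are optional.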

Is llms.txt required for AI Search visibility?

No. There is currently no verified public evidence that llms.txt is a universal requirement for visibility in ChatGPT Search, Google AI Overviews, Gemini, or Perplexity.

It is best treated as an experimental AI-readiness layer that may help document your most authoritative content — implemented alongside robots.txt, sitemap.xml, schema markup, and strong internal linking, not as a replacement for them.

Source

The llms.txt specification is maintained at llmstxt.org, where its authors describe it as a proposal to standardise a file that helps LLMs use website information at inference time.