llms.txt: The robots.txt for AI

By Reed

In 1994, a Dutch software engineer named Martijn Koster had a problem. Web crawlers were hammering his server, and there was no way to tell them which pages to index and which to leave alone. His solution — a simple text file called robots.txt — became one of the most universally adopted standards on the web.

Thirty years later, we're in the same spot. Except this time, it's not search engine crawlers knocking on the door. It's AI agents.

The problem

When an AI coding assistant needs to understand your API, it doesn't browse your docs like a human would. It needs structured, machine-readable content it can pull into its context window. But there's no standard way to say: "Here's what my project is about, and here's the important stuff."

Most AI tools end up scraping HTML, guessing at navigation structure, or relying on whatever fragments ended up in their training data. The result is hallucinated APIs, outdated examples, and developers who stop trusting the AI's output.

Enter llms.txt

llms.txt is a proposed standard (introduced by Jeremy Howard in 2024) that solves this. It's a Markdown file, served at /llms.txt, that gives AI agents a structured overview of your project:

# My Project

> A brief description of what this project does.

## Docs

- [Getting Started](/docs/getting-started.md)
- [API Reference](/docs/api.md)
- [Configuration](/docs/config.md)

## Optional

- [Changelog](/CHANGELOG.md)
- [Contributing](/CONTRIBUTING.md)

No schema to learn, no build step, no complex configuration. Just a text file that tells AI agents what's important and where to find it.
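Because the format is just Markdown headings and link lists, a consumer can parse it in a dozen lines. Here's a minimal sketch — the function name, regex, and return shape are illustrative choices, not part of any spec:

```python
import re

def parse_llms_txt(text: str) -> dict[str, list[tuple[str, str]]]:
    """Parse an llms.txt file into {section: [(link title, url), ...]}."""
    sections: dict[str, list[tuple[str, str]]] = {}
    current = None
    for line in text.splitlines():
        if line.startswith("## "):
            # A new H2 starts a section ("Docs", "Optional", ...).
            current = line[3:].strip()
            sections[current] = []
        elif current and (m := re.match(r"-\s*\[(.+?)\]\((.+?)\)", line.strip())):
            # A "- [Title](url)" list item belongs to the current section.
            sections[current].append((m.group(1), m.group(2)))
    return sections
```

Fed the example above, this yields a `"Docs"` section with three links and an `"Optional"` section with two — enough for an agent to decide which URLs to pull into context.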

There's also llms-full.txt — a single file containing your entire documentation as one Markdown document. Useful for tools that want to ingest everything at once.

Why now

AI-assisted development is crossing a threshold:

  • Cursor, Windsurf, Claude Code, and others are becoming primary interfaces for reading documentation
  • MCP (Model Context Protocol) is standardizing how AI tools connect to external data sources
  • RAG pipelines are how enterprises are making AI useful with their own content

If your project doesn't have an llms.txt, you're relying on AI tools to figure out your docs by scraping HTML and hoping for the best. That's the equivalent of not having a robots.txt in 1996 — things still work, but you're leaving discoverability on the table.

The catch

Writing and maintaining an llms.txt by hand is tedious. You have to keep it in sync with your actual docs, update it when you add or rename a page, and include the right level of detail without overwhelming the context window.

For llms-full.txt, it's worse — you need to concatenate all your documentation into a single, well-formatted text file and regenerate it on every change.
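The concatenation itself is mechanical, which is exactly why it belongs in a build step rather than in anyone's hands. A rough sketch of a generator, assuming your docs live in a directory of Markdown files (the directory layout, separator, and source-path comments are arbitrary choices, not part of the proposal):

```python
from pathlib import Path

def build_llms_full(docs_dir: str, out_path: str) -> None:
    """Concatenate every Markdown file under docs_dir into one llms-full.txt."""
    parts = []
    for md in sorted(Path(docs_dir).rglob("*.md")):
        rel = md.relative_to(docs_dir)
        # Prefix each page with its source path so an agent can cite where
        # a given passage came from.
        parts.append(f"<!-- source: /{rel} -->\n\n{md.read_text()}")
    Path(out_path).write_text("\n\n---\n\n".join(parts) + "\n")
```

Simple enough — but the hard part was never the script. It's remembering to rerun it after every edit, rename, and deleted page, forever.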

Most teams won't bother.

How we handle it

Every site hosted on denote.cloud gets llms.txt and llms-full.txt generated automatically. No configuration, no extra build step. Your docs change, the files update.

We also ship a built-in MCP server and a JSON API on every site — so AI tools can query your docs as a structured tool, not just read a static file. All of this is on the free tier, because AI-readability should be table stakes.

The engine behind all of this is open source. No black boxes.

Join the waitlist →