The llms.txt standard provides AI assistants with a concise summary of your site's content. Think of it as robots.txt for AI inference—not blocking access, but guiding AI to your most useful documentation.
Jeremy Howard proposed the standard in September 2024. Unlike robots.txt, llms.txt uses Markdown and targets AI tools at inference time rather than crawlers at training time.
Nuxt has dedicated support via nuxt-llms—automatic generation from your config or content.
Install the module:
```bash
npx nuxi module add nuxt-llms
```
Configure in nuxt.config.ts:
```ts
export default defineNuxtConfig({
  modules: ['nuxt-llms'],
  llms: {
    domain: 'https://example.com',
    title: 'My Nuxt App',
    description: 'Documentation for My Nuxt App',
    sections: [
      {
        title: 'Getting Started',
        links: [
          { title: 'Installation', href: '/docs/installation' },
          { title: 'Configuration', href: '/docs/configuration' }
        ]
      }
    ]
  }
})
```
The module generates /llms.txt and /llms-full.txt automatically at build time.
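To confirm both routes are actually emitted, a quick end-to-end test works well. The sketch below assumes Vitest with @nuxt/test-utils is already configured, and that the title from the config above ends up as the file's H1.

```ts
// test/llms.test.ts: a sketch assuming Vitest + @nuxt/test-utils are set up.
import { $fetch, setup } from '@nuxt/test-utils/e2e'
import { describe, expect, it } from 'vitest'

describe('llms.txt routes', async () => {
  await setup({}) // builds and serves the app for the tests below

  it('serves /llms.txt with the configured title', async () => {
    const body = await $fetch<string>('/llms.txt')
    expect(body).toContain('# My Nuxt App')
  })

  it('serves /llms-full.txt', async () => {
    const body = await $fetch<string>('/llms-full.txt')
    expect(body.length).toBeGreaterThan(0)
  })
})
```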
llms.txt solves a specific problem: LLM context windows are too small to process entire websites. Your Nuxt documentation might be thousands of pages, but an AI assistant needs a curated entry point.
llms.txt is most useful for documentation-heavy sites, API references, and developer tools whose docs are actually consumed by AI assistants. It's overkill for a small marketing site or blog whose entire content already fits comfortably in a model's context window.
| Feature | robots.txt | llms.txt |
|---|---|---|
| Purpose | Block/allow crawling | Guide AI to useful content |
| Format | Custom syntax | Markdown |
| When used | Training data collection | Inference (answering questions) |
| Crawler support | All major crawlers | Limited—mostly AI coding tools |
| Required | No | No |
Important: AI crawlers don't currently request llms.txt during inference. GPTBot, ClaudeBot, and PerplexityBot use pre-built datasets and respect robots.txt, not llms.txt.
The primary use case today is AI coding assistants (Cursor, Claude Code) and MCP servers that explicitly fetch llms.txt to understand project documentation.
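To make that concrete, here's roughly what "fetching llms.txt" looks like from the tool's side: a plain HTTP request for the file, preferring the concise version when context is tight. The function name and token budget below are illustrative, not taken from any particular tool.

```ts
// Illustrative sketch of how an AI coding tool or MCP server might pull in
// a project's docs index. The name and the 50k-token threshold are hypothetical.
async function loadDocsIndex(origin: string, contextBudgetTokens = 8_000): Promise<string> {
  // Use the comprehensive file only when the context budget is generous.
  const path = contextBudgetTokens > 50_000 ? '/llms-full.txt' : '/llms.txt'
  const res = await fetch(new URL(path, origin))
  if (!res.ok)
    throw new Error(`${path} not found at ${origin} (HTTP ${res.status})`)
  return await res.text()
}
```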
llms.txt uses a structured Markdown format. Only the H1 title is required—everything else is optional.
```md
# Nuxt SEO Documentation

> Complete SEO toolkit for Nuxt applications. Handles sitemaps, robots.txt, OG images, and schema.org.

Key modules: sitemap, robots, og-image, schema-org, seo-utils.

## Getting Started

- [Installation](https://nuxtseo.com/docs/nuxt-seo/getting-started/introduction): Install the full Nuxt SEO module
- [Site Config](https://nuxtseo.com/docs/site-config/getting-started/introduction): Configure your site URL and name

## Modules

- [Sitemap](https://nuxtseo.com/docs/sitemap/getting-started/introduction): Automatic sitemap generation
- [Robots](https://nuxtseo.com/docs/robots/getting-started/introduction): robots.txt and meta robots
- [OG Image](https://nuxtseo.com/docs/og-image/getting-started/introduction): Dynamic social images

## Optional

- [Changelog](https://github.com/harlan-zw/nuxt-seo/blob/main/CHANGELOG.md)
```
- **H1 title**: The only required element; the name of your project or site.
- **Blockquote**: A brief summary with the key information needed to understand the rest of the file.
- **Body content**: Paragraphs, lists, or any Markdown except headings, providing context about the project.
- **H2 sections**: File lists with links to detailed documentation. Each entry is a Markdown link with an optional description: `- [Link Text](https://url): Optional description of what this page covers`
- **Optional section**: An H2 titled "Optional" marks content that AI can skip if context is limited.
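That structure maps onto a small rendering helper almost one-to-one. The sketch below is illustrative only (the types and function are not part of nuxt-llms or the spec); it just shows how the pieces fit together.

```ts
// Renders the llms.txt structure described above from plain data.
// Illustrative only; nuxt-llms generates this for you.
interface LlmsLink { title: string, href: string, description?: string }
interface LlmsSection { title: string, links: LlmsLink[] }

function renderLlmsTxt(doc: { title: string, summary?: string, notes?: string, sections: LlmsSection[] }): string {
  const lines = [`# ${doc.title}`] // H1 title: the only required element
  if (doc.summary)
    lines.push('', `> ${doc.summary}`) // blockquote summary
  if (doc.notes)
    lines.push('', doc.notes) // body content: any Markdown except headings
  for (const section of doc.sections) {
    lines.push('', `## ${section.title}`) // H2 section; title it "Optional" for skippable links
    for (const link of section.links)
      lines.push(`- [${link.title}](${link.href})${link.description ? `: ${link.description}` : ''}`)
  }
  return `${lines.join('\n')}\n`
}
```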
Define sections explicitly when you want full control:
```ts
export default defineNuxtConfig({
  modules: ['nuxt-llms'],
  llms: {
    domain: 'https://example.com',
    title: 'My Nuxt App',
    description: 'A Nuxt 3 application with TypeScript.',
    sections: [
      {
        title: 'Documentation',
        links: [
          { title: 'API Reference', href: '/docs/api' },
          { title: 'Components', href: '/docs/components' },
          { title: 'Getting Started', href: '/docs/setup' }
        ]
      },
      {
        title: 'Optional',
        links: [
          { title: 'Changelog', href: '/changelog' }
        ]
      }
    ]
  }
})
```
If you're using Nuxt Content, nuxt-llms can generate sections from your content automatically:
```ts
export default defineNuxtConfig({
  modules: ['@nuxt/content', 'nuxt-llms'],
  llms: {
    domain: 'https://example.com',
    title: 'My Docs',
    // Automatically generates from /content/docs
    content: {
      collections: ['docs']
    }
  }
})
```
The module generates both /llms.txt (concise) and /llms-full.txt (comprehensive) automatically. AI tools can choose based on their context limits.
```ts
export default defineNuxtConfig({
  llms: {
    // ... config
    full: {
      // Include full page content in llms-full.txt
      includeContent: true
    }
  }
})
```
Once deployed, verify that the file is served at https://yoursite.com/llms.txt. There's no official validator yet, but the llms-txt-hub directory lists sites implementing the standard.
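In the absence of an official validator, a few sanity checks cover the basics described earlier (exactly one H1 title, at least one link entry). The sketch below is informal, not a reference implementation:

```ts
// Informal sanity checks for an llms.txt file; not an official validator.
function checkLlmsTxt(markdown: string): string[] {
  const problems: string[] = []

  // The format expects exactly one H1 title.
  const h1Count = markdown.split('\n').filter(line => line.startsWith('# ')).length
  if (h1Count !== 1)
    problems.push(`expected exactly one H1 title, found ${h1Count}`)

  // Sections should contain "- [Title](url)" link entries.
  if (!/^- \[.+?\]\(.+?\)/m.test(markdown))
    problems.push('no "- [Title](url)" link entries found')

  return problems
}

// Example: fetch the deployed file and print any problems.
const issues = checkLlmsTxt(await fetch('https://yoursite.com/llms.txt').then(r => r.text()))
console.log(issues.length ? issues : 'llms.txt looks structurally sound')
```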
The standard has 2,000+ GitHub stars and growing adoption among documentation sites. Notable implementers include Answer.AI, fast.ai, and various open source projects.
However, adoption by AI providers is limited: as of late 2025, the major AI crawlers still don't fetch llms.txt during inference, and support remains concentrated in AI coding assistants and MCP-based tooling.
The spec is still emerging. Implementing llms.txt today is forward-looking—it positions your docs for better AI integration as adoption grows.
llms.txt complements Generative Engine Optimization but serves a different purpose:
| GEO | llms.txt |
|---|---|
| Optimizes content for AI citations | Provides structured entry point to docs |
| Targets AI search (ChatGPT, Perplexity) | Targets AI coding tools and MCP servers |
| Uses schema.org, content structure | Uses Markdown file format |
| Improves visibility in AI responses | Improves AI understanding of your project |
For maximum AI visibility, implement both: optimize your content and markup for AI citations (GEO), and publish llms.txt as a curated entry point for AI coding tools.
For comprehensive AI and search optimization, use the full Nuxt SEO module.
This includes automatic schema.org, sitemaps, robots.txt, and OG images—all signals that help both traditional search and AI search understand your content.
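A minimal setup might look like this, assuming the @nuxtjs/seo package and the same example.com domain used earlier:

```ts
export default defineNuxtConfig({
  // Nuxt SEO bundles the sitemap, robots, og-image, schema-org and seo-utils modules.
  modules: ['@nuxtjs/seo'],
  site: {
    // Shared site config consumed by all of the bundled modules.
    url: 'https://example.com',
    name: 'My Nuxt App'
  }
})
```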