$ nuxt-seo tools
Generate robots.txt with AI crawler presets. Test rules instantly.
# Generated by Nuxt SEO
# https://nuxtseo.com/tools/robots-txt-generator
User-agent: *
Allow: /
$ robots.txt directives
User-agent: *      # applies to all crawlers
Disallow: /        # block entire site
Allow: /           # explicitly allow (for exceptions)
Crawl-delay: 10    # wait 10s between requests (Bing/Yandex)
Sitemap: URL       # specify sitemap location
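The directives above can be tested programmatically. A minimal sketch using Python's standard-library `urllib.robotparser`; the user agents and rules here are illustrative, not from the generated file above:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: allow everyone, block one AI crawler
rules = """\
User-agent: *
Allow: /

User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "/about"))  # True: matched by the * group
print(rp.can_fetch("GPTBot", "/about"))     # False: GPTBot group blocks all
```

Note that `urllib.robotparser` follows the original exclusion protocol and does not support `*`/`$` wildcards inside paths, so keep rules literal when testing with it.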
$ Content-Usage header
Uses y/n values:
Content-Usage: search=y      # allow search indexing
Content-Usage: train-ai=n    # disallow AI model training
Combined: Content-Usage: search=y, train-ai=n

$ Content-Signal header
Uses yes/no values:
search=yes     # allow search indexing
ai-input=no    # disallow live AI answers
ai-train=no    # disallow model training
$ wildcard patterns
*          # matches any sequence of characters
$          # matches end of URL
/*.pdf     # all .pdf files
/*.php$    # URLs ending in .php
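Wildcard matching can be sketched by translating a rule into a regular expression: `*` becomes `.*` and a trailing `$` becomes an end anchor. The helper name `rule_to_regex` is hypothetical; this follows the Google-style wildcard semantics described above, with prefix matching from the start of the path:

```python
import re

def rule_to_regex(rule: str) -> re.Pattern:
    """Translate a robots.txt path rule with * and $ into a regex."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    # A trailing $ in the rule anchors the end of the URL
    if pattern.endswith(r"\$"):
        pattern = pattern[: -len(r"\$")] + "$"
    return re.compile(pattern)

print(bool(rule_to_regex("/*.pdf").match("/docs/guide.pdf")))   # True
print(bool(rule_to_regex("/*.php$").match("/index.php")))       # True
print(bool(rule_to_regex("/*.php$").match("/index.php?x=1")))   # False: query string after .php
```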
Disallow: /admin/      # block /admin/ directory
Disallow: /*?          # block URLs with query strings
Disallow: /*.json$     # block all .json files
Disallow: /private/*   # block everything under /private/
Allow: /api/public     # allow specific path (exception)

$ top AI crawlers to block
GPTBot
ClaudeBot
CCBot
Google-Extended
Bytespider
PerplexityBot
Note: Google-Extended = Gemini training (not Search)
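Per RFC 9309, several User-agent lines can share one group, so the crawlers above can be blocked in a single rule while leaving search crawlers unaffected. A sketch of such a file:

```txt
# Block the AI crawlers listed above while keeping the site open to search
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
User-agent: Google-Extended
User-agent: Bytespider
User-agent: PerplexityBot
Disallow: /

User-agent: *
Allow: /
```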
$ content preference headers
Content-Usage: search=y, train-ai=n
Content-Signal: search=yes, ai-train=no
ai-input = live answers; ai-train = model training
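These preferences can also be delivered as HTTP response headers. A sketch of what a response carrying both headers might look like (server and values illustrative):

```txt
HTTP/1.1 200 OK
Content-Type: text/html
Content-Usage: search=y, train-ai=n
Content-Signal: search=yes, ai-input=no, ai-train=no
```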
Generate robots.txt rules dynamically based on routes, environments, or user conditions with the Nuxt Robots module.
Explore Nuxt Robots