Config using Robots.txt · Nuxt Robots · Nuxt SEO


v6.0.6


# Config using Robots.txt


## [Introduction](#introduction)

The [robots.txt standard](https://developers.google.com/search/docs/crawling-indexing/robots/create-robots-txt) is important for search engines to understand which pages to crawl and index on your site.

New to robots.txt? Check out the [Robots.txt Guide](https://nuxtseo.com/learn-seo/nuxt/controlling-crawlers/robots-txt) to learn more.

To match the robots standard more closely, Nuxt Robots recommends configuring the module with a `robots.txt` file, which is parsed and validated before being used to configure the module.

If you need programmatic control, you can configure the module using [nuxt.config.ts](https://nuxtseo.com/docs/robots/guides/nuxt-config), [Route Rules](https://nuxtseo.com/docs/robots/guides/route-rules) and [Nitro hooks](https://nuxtseo.com/docs/robots/nitro-api/nitro-hooks).
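For instance, a disallow rule that would otherwise live in `robots.txt` can be expressed in `nuxt.config.ts` instead. A minimal sketch using the module's `disallow` and `allow` options (see the Nuxt Config guide for the full option set):

```
export default defineNuxtConfig({
  robots: {
    // Applied to all user agents, equivalent to the robots.txt group:
    //   User-agent: *
    //   Disallow: /admin
    //   Allow: /admin/login
    disallow: ['/admin'],
    allow: ['/admin/login']
  }
})
```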

## [Creating a `robots.txt` file](#creating-a-robotstxt-file)

You can place your file in several locations; the easiest is `<rootDir>/public/_robots.txt`.

Additionally, the following paths are supported by default:

Example File Structure

```
# root directory
robots.txt
# asset folders
assets/
├── robots.txt
# pages folder
pages/
├── robots.txt
├── _dir/
│   └── robots.txt
# public folder
public/
├── _robots.txt
├── _dir/
│   └── robots.txt
```

### [Custom paths](#custom-paths)

If you find this too restrictive, you can use the `mergeWithRobotsTxtPath` config to load your `robots.txt` file from any path.

nuxt.config.ts

```
export default defineNuxtConfig({
  robots: {
    mergeWithRobotsTxtPath: 'assets/custom/robots.txt'
  }
})
```

## [Parsed robots.txt](#parsed-robotstxt)

The following rules are parsed from your `robots.txt` file:

- `User-agent` - The user-agent to apply the rules to.
- `Disallow` - An array of paths to disallow for the user-agent.
- `Allow` - An array of paths to allow for the user-agent.
- `Sitemap` - An array of sitemap URLs to include in the generated [sitemap](https://nuxtseo.com/docs/sitemap/getting-started/introduction).
- `Content-Usage` / `Content-Signal` - Directives for expressing AI usage preferences (see [AI Directives](#ai-directives) below).

This parsed data will be shown for environments that are `indexable`.
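To illustrate the directive model, here is a rough sketch of how such a file could be parsed into per-user-agent groups. This is not the module's actual parser, just an illustrative TypeScript version covering the core directives:

```
interface RobotsGroup {
  userAgent: string[]
  allow: string[]
  disallow: string[]
}

function parseRobotsTxt(input: string): { groups: RobotsGroup[], sitemaps: string[] } {
  const groups: RobotsGroup[] = []
  const sitemaps: string[] = []
  let current: RobotsGroup | null = null
  for (const raw of input.split('\n')) {
    // strip comments and whitespace
    const line = raw.split('#')[0].trim()
    if (!line) continue
    const idx = line.indexOf(':')
    if (idx === -1) continue
    const key = line.slice(0, idx).trim().toLowerCase()
    const value = line.slice(idx + 1).trim()
    switch (key) {
      case 'user-agent':
        // consecutive User-agent lines share one group; a new group
        // starts once Allow/Disallow rules have been seen
        if (!current || current.allow.length || current.disallow.length) {
          current = { userAgent: [], allow: [], disallow: [] }
          groups.push(current)
        }
        current.userAgent.push(value)
        break
      case 'allow':
        current?.allow.push(value)
        break
      case 'disallow':
        current?.disallow.push(value)
        break
      case 'sitemap':
        // Sitemap is global, not tied to a user-agent group
        sitemaps.push(value)
        break
    }
  }
  return { groups, sitemaps }
}
```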

## [AI Directives](#ai-directives)

AI Directives allow you to control how AI systems, search engines, and automated tools interact with your content. Several methods are supported:

- **Vendor-Specific Tokens** - Official opt-outs for major providers (`Google-Extended`, `Applebot-Extended`)
- **Content-Usage** - IETF standard (`bots`, `train-ai`, `ai-output`, `search`) with `y`/`n` values
- **Content-Signal** - Cloudflare implementation (`search`, `ai-input`, `ai-train`) with `yes`/`no` values

### [Quick Example](#quick-example)

robots.txt

```
User-agent: *
Allow: /
Content-Usage: bots=y, train-ai=n
Content-Signal: ai-train=no, search=yes
```

See the [**AI Directives Guide**](https://nuxtseo.com/docs/robots/guides/ai-directives) for complete documentation, examples, validation rules, and best practices.

**Validate your output** - Test your robots.txt rules with our [Robots.txt Tester](https://nuxtseo.com/tools/robots-txt-generator).

## [Conflicting `public/robots.txt`](#conflicting-publicrobotstxt)

To ensure other modules can integrate with your generated robots file, you must not have a `robots.txt` file in your `public` folder.

If you do, it will be moved to `<rootDir>/public/_robots.txt` and merged with the generated file.

Copyright © 2023-2026 Harlan Wilton - [MIT License](https://github.com/harlan-zw/nuxt-seo/blob/main/license) · [mdream](https://mdream.dev)