# How Nuxt Robots Works


Nuxt Robots tells robots (crawlers) how to behave by creating a `robots.txt` file for you, and by adding an `X-Robots-Tag` header and a `<meta name="robots">` tag to your site where appropriate.

One important behaviour to control is blocking Google from indexing certain pages, which helps to:

- Prevent [duplicate content issues](https://moz.com/learn/seo/duplicate-content)
- Prevent wasting [crawl budget](https://developers.google.com/search/docs/crawling-indexing/large-site-managing-crawl-budget)

## [Robots.txt](#robotstxt)

For robots to understand how they can access your site, they will first check for a [robots.txt file](https://nuxtseo.com/docs/robots/guides/robots-txt).

```
public
 └── robots.txt
```

This file is generated differently depending on the environment:

- When deploying using `nuxi generate`, or when the route is pre-rendered via `nitro.prerender.routes`, this is a static file created at build time.
- Otherwise, it's handled by the server and generated at runtime when requested.
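For example, if you want the file shipped as a static asset even in an SSR deployment, you can pre-render it with standard Nitro config (a minimal sketch using the built-in `nitro.prerender.routes` option):

```typescript
// nuxt.config.ts
export default defineNuxtConfig({
  nitro: {
    prerender: {
      // Pre-render /robots.txt at build time so it is served as a static file
      routes: ['/robots.txt'],
    },
  },
})
```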

When indexing is disabled, a `robots.txt` will be generated with the following content:

robots.txt

```
User-agent: *
Disallow: /
```

This blocks all bots from indexing your site.
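Disabling indexing site-wide is typically done through your site config rather than by writing this file yourself (a sketch assuming the `site.indexable` option from Nuxt Site Config; see the Disabling Site Indexing guide for the full set of options):

```typescript
// nuxt.config.ts
export default defineNuxtConfig({
  site: {
    // Marks the whole site as non-indexable,
    // producing the "Disallow: /" robots.txt shown above
    indexable: false,
  },
})
```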

## [`X-Robots-Tag` Header and `<meta name="robots">`](#x-robots-tag-header-and-meta-namerobots)

In some situations, `robots.txt` alone is too limited to provide the level of control you need over your site's indexing.

For this reason, the module by default also provides an `X-Robots-Tag` header and a `<meta name="robots">` tag.

These are applied using the following logic:

- `X-Robots-Tag` header - applied via route rules in all rendering modes, otherwise SSR only. The header is added for all pages: set to the enabled value (`robotsEnabledValue`) for indexable pages and the disabled value (`robotsDisabledValue`) for non-indexable pages.
- `<meta name="robots">` - SSR only; always added.
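For instance, a route rule can mark a path non-indexable, which sets both the header and the meta tag (a sketch using the `robots` route rule the module registers; see the Config Using Route Rules guide):

```typescript
// nuxt.config.ts
export default defineNuxtConfig({
  routeRules: {
    // Responds with the disabled X-Robots-Tag header and,
    // when server-rendered, a matching <meta name="robots"> tag
    '/admin/**': { robots: false },
  },
})
```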

## [Robot Rules](#robot-rules)

The default values for the `robots` rule depend on whether the route is indexable.

For indexable routes the following is used:

```
<meta name="robots" content="index, follow, max-image-preview:large, max-snippet:-1, max-video-preview:-1">
```

Besides giving robots the go-ahead, this also requests that Google:

> Choose the snippet length that it believes is most effective to help users discover your content and direct users to your site.

You can learn more in the [Robots Meta Tag](https://developers.google.com/search/docs/crawling-indexing/robots-meta-tag) documentation. Feel free to change this to suit your needs using `robotsEnabledValue`.
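Overriding the defaults is a matter of module options (a sketch; `robotsEnabledValue` and `robotsDisabledValue` are the module options named above):

```typescript
// nuxt.config.ts
export default defineNuxtConfig({
  robots: {
    // Value used for the header and meta tag on indexable pages
    robotsEnabledValue: 'index, follow, max-image-preview:large',
    // Value used on non-indexable pages
    robotsDisabledValue: 'noindex, nofollow',
  },
})
```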

For non-indexable routes the following is used:

```
<meta name="robots" content="noindex, nofollow">
```

This tells robots not to index the page or follow any of its links.
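On a per-page basis, the same value can be set from a page component, for example with Nuxt's `useSeoMeta` composable (a sketch; see the Disable Page Indexing guide for the module's own helpers):

```typescript
// pages/private.vue — inside <script setup>
useSeoMeta({
  // Renders <meta name="robots" content="noindex, nofollow"> for this page only
  robots: 'noindex, nofollow',
})
```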

## [Development Environment](#development-environment)

The module by default will disable indexing in development environments. This is for safety, as you don't want your development environment to be indexed by search engines.

robots.txt

```
# Block all bots
User-agent: *
Disallow: /
```

## [Production Environments](#production-environments)

For production environments, the module will generate a `robots.txt` file that allows all bots.

Out of the box, the generated file looks like this:

robots.txt

```
User-agent: *
Disallow:
```

This tells all bots that they can index your entire site.
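If you want production to stay indexable while blocking specific paths, the module config accepts allow/disallow lists (a sketch; `disallow` is a module option, and `/admin` here is a placeholder path):

```typescript
// nuxt.config.ts
export default defineNuxtConfig({
  robots: {
    // Adds "Disallow: /admin" under "User-agent: *" in the generated robots.txt
    disallow: ['/admin'],
  },
})
```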

**Test your rules** - validate your robots.txt with the [Robots.txt Generator & Tester](https://nuxtseo.com/tools/robots-txt-generator).

Copyright © 2023-2026 Harlan Wilton - [MIT License](https://github.com/harlan-zw/nuxt-seo/blob/main/license)