Why use Nuxt Robots?

Nuxt Robots is a module for managing how robots crawl and index your site, with minimal config and best practice defaults.

The core features of the module are:

  • Telling crawlers which paths they can and cannot access using a robots.txt file.
  • Telling search engine crawlers what they can show in search results from your site using a <meta name="robots" ...> meta tag or X-Robots-Tag HTTP header (illustrated below).
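
In practice these rules surface as three artifacts. A rough illustration (the paths and values here are examples, not module defaults):

    # /robots.txt: crawl directives
    User-agent: *
    Disallow: /admin

    # HTTP response header sent for a non-indexable page
    X-Robots-Tag: noindex, nofollow

    # equivalent meta tag rendered into the page <head>
    <meta name="robots" content="noindex, nofollow">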

New to robots or SEO? Check out the Controlling Web Crawlers guide to learn more about why you might need these features.

While it's simple to create your own robots.txt file, the module makes sure your non-production environments are never indexed. This avoids duplicate content issues and stops search engines from serving your development or staging content to users.

The module also acts as an integration point for other modules. For example:

  • Nuxt Sitemap ensures pages you've marked as disallowed from indexing are excluded from the sitemap.
  • Nuxt Schema.org skips rendering Schema.org data if the page is marked as excluded from indexing.

Ready to get started? Check out the installation guide.

Features

Nuxt Robots manages how robots crawl and index your site, with minimal config and best practice defaults.

🤖 Robots.txt Config

Configuring the rules is as simple as adding a production robots.txt file to your project.
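
For example, a minimal file might look like this (the public/_robots.txt location is an assumption here; check the module docs for where your version reads it from):

    # public/_robots.txt: merged into the generated /robots.txt
    User-agent: *
    Disallow: /admin
    Allow: /admin/login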

🗿 X-Robots-Tag Header, Meta Tag

Ensures pages that should not be indexed stay out of search results using the following:

  • X-Robots-Tag header
  • <meta name="robots" ...> meta tag

Both enabled by default.
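
You can verify the header with any HTTP client. An illustrative check against a hypothetical staging host:

    $ curl -I https://staging.example.com/
    HTTP/1.1 200 OK
    X-Robots-Tag: noindex, nofollow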

🔒 Production-only indexing

The module uses Nuxt Site Config to determine if the site is in production mode.

It disables indexing for non-production environments, avoiding duplicate content issues.
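
Concretely, a non-production deploy serves a robots.txt that blocks all crawling, roughly:

    # robots.txt served in a non-production environment
    User-agent: *
    Disallow: /

If a given environment should still be indexable, you can override the detection through Nuxt Site Config.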

🔄 Easy and powerful configuration

Use route rules to easily target subsets of your site. When you need even more control, use the runtime Nitro hooks to dynamically configure your robots rules.
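
As a sketch, assuming the routeRules robots key and the robots:config Nitro hook documented by the module (verify both names and the config shape against your installed version):

    // nuxt.config.ts: route rules targeting a subtree
    export default defineNuxtConfig({
      routeRules: {
        '/admin/**': { robots: 'noindex, nofollow' },
      },
    })

    // server/plugins/robots.ts: dynamic rules via a Nitro runtime hook
    export default defineNitroPlugin((nitroApp) => {
      nitroApp.hooks.hook('robots:config', (config) => {
        // shape assumed: groups of { userAgent, allow, disallow }
        // e.g. pull disallowed paths from a CMS at runtime
        config.groups.push({ userAgent: ['*'], disallow: ['/preview'] })
      })
    })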

🌎 I18n Support

Automatically fixes any non-localised paths within your allow and disallow rules.
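
For example, with en and fr locale prefixes, a rule for /admin would be expanded roughly like this (illustrative output):

    # generated /robots.txt, assuming an en/fr prefix strategy
    User-agent: *
    Disallow: /en/admin
    Disallow: /fr/admin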
