Getting Started

Nuxt Simple Robots Features

Get started with nuxt-simple-robots by installing the dependency in your project.
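For example, after installing the package (e.g. `npm i -D nuxt-simple-robots`), register the module in your Nuxt config. A minimal sketch following the standard Nuxt module pattern:

```ts
// nuxt.config.ts
export default defineNuxtConfig({
  // registers nuxt-simple-robots so it can generate robots rules for the site
  modules: ['nuxt-simple-robots']
})
```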

Nuxt Simple Robots manages the robots crawling your site with minimal config and best-practice defaults.

🤖 Robots.txt Config

Configuring the rules is as simple as adding a production robots.txt file to your project.
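Rules can also be declared directly in your Nuxt config. A minimal sketch — the `robots` key with `allow` and `disallow` options reflects the module's conventions, but treat the exact option names as assumptions:

```ts
// nuxt.config.ts — declarative robots rules (option names assumed)
export default defineNuxtConfig({
  robots: {
    // keep crawlers out of the admin area...
    disallow: ['/admin'],
    // ...but allow the login page to be crawled
    allow: ['/admin/login']
  }
})
```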

🗿 X-Robots-Tag Header, Meta Tag

Ensures pages that should not be indexed are correctly flagged for crawlers using the following:

  • X-Robots-Tag header
  • <meta name="robots" ...> meta tag

Both are enabled by default.
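For instance, a route can be marked non-indexable through Nuxt route rules, and the module then emits both the header and the meta tag for matching pages. A sketch — the `robots` route-rule key is an assumption based on the module's conventions:

```ts
// nuxt.config.ts — per-route indexing control via route rules
export default defineNuxtConfig({
  routeRules: {
    // matching pages get an X-Robots-Tag header and
    // a <meta name="robots" content="noindex, nofollow"> tag
    '/private/**': { robots: 'noindex, nofollow' }
  }
})
```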

🔒 Production only indexing

The module uses Nuxt Site Config to determine if the site is in production mode.

It disables indexing for non-production environments, avoiding duplicate content issues.
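If you need to override the automatic detection, Nuxt Site Config exposes an indexability flag. A sketch — the `site.indexable` key and the `NUXT_SITE_ENV` variable are assumptions based on that module's conventions:

```ts
// nuxt.config.ts — explicitly controlling indexability (keys assumed)
export default defineNuxtConfig({
  site: {
    // only allow indexing when the environment is explicitly production;
    // staging builds then get a disallow-all robots.txt
    indexable: process.env.NUXT_SITE_ENV === 'production'
  }
})
```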

🔄 Easy and powerful configuration

Use route rules to easily target subsets of your site. When you need even more control, use the runtime Nitro hooks to dynamically configure your robots rules.
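As a sketch of the hook approach, a Nitro server plugin can adjust the generated rules at runtime. The `robots:config` hook name and the config shape (groups with `userAgent`/`disallow`) are assumptions, not confirmed API:

```ts
// server/plugins/robots.ts — dynamic rules via a Nitro plugin (hook name assumed)
export default defineNitroPlugin((nitroApp) => {
  nitroApp.hooks.hook('robots:config', (config) => {
    // e.g. block an extra path decided at runtime
    config.groups.push({ userAgent: ['*'], disallow: ['/internal'] })
  })
})
```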

🌎 I18n Support

Automatically expands any non-localised paths in your allow and disallow rules to cover each locale prefix.
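For example, assuming an i18n setup with `en` and `fr` locales, a single non-localised rule also covers the locale-prefixed paths (the exact expansion shape is an assumption):

```ts
// nuxt.config.ts — a non-localised rule with i18n enabled
export default defineNuxtConfig({
  robots: {
    disallow: ['/secret']
  }
})
// with locales ['en', 'fr'], the generated rules would also
// disallow the localised paths, e.g. /en/secret and /fr/secret
```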