Nuxt Simple Robots Features
Get started with nuxt-simple-robots by installing the dependency in your project and registering the module.
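As a minimal setup sketch, assuming installation via npm and registration in `nuxt.config.ts`:

```ts
// nuxt.config.ts — register the module after installing it,
// e.g. with `npm install -D nuxt-simple-robots`
export default defineNuxtConfig({
  modules: ['nuxt-simple-robots'],
})
```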
Nuxt Simple Robots manages how robots crawl your site, with minimal configuration and best-practice defaults.
Configuring the rules is as simple as adding a production robots.txt file to your project.
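For example, a sketch of a production robots.txt (the file location and the rules shown are illustrative assumptions; check the module docs for where it picks up your robots.txt):

```txt
# public/_robots.txt — path is an assumption, adjust to your setup
User-agent: *
Disallow: /admin
Disallow: /internal

Sitemap: https://example.com/sitemap.xml
```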
Ensures pages that should not be indexed are not indexed, using the following:
- <meta name="robots" ...> meta tag
- X-Robots-Tag HTTP header
Both are enabled by default.
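As an illustrative sketch, a route blocked from indexing might receive output along these lines (the exact content value depends on your configuration):

```html
<!-- Rendered in the page head for a blocked route (illustrative values) -->
<meta name="robots" content="noindex, nofollow">
<!-- The equivalent HTTP response header: -->
<!-- X-Robots-Tag: noindex, nofollow -->
```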
The module uses Nuxt Site Config to determine if the site is in production mode.
It disables indexing for non-production environments, avoiding duplicate content issues.
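If you need to override this detection, a sketch assuming the `site.indexable` option provided by Nuxt Site Config (verify the key against the Nuxt Site Config docs):

```ts
// nuxt.config.ts — force indexing on a non-production deployment
// (`site.indexable` is assumed to be the Nuxt Site Config flag for this)
export default defineNuxtConfig({
  modules: ['nuxt-simple-robots'],
  site: {
    indexable: true,
  },
})
```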
Use route rules to easily target subsets of your site. When you need even more control, use the runtime Nitro hooks to dynamically configure your robots rules.
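A sketch of both approaches, assuming the `robots` route-rule key and the `robots:config` Nitro hook (check the module docs for the exact names in your version):

```ts
// nuxt.config.ts — block a subset of the site via route rules
export default defineNuxtConfig({
  modules: ['nuxt-simple-robots'],
  routeRules: {
    // `robots: false` is assumed to mark these routes as non-indexable
    '/admin/**': { robots: false },
  },
})
```

```ts
// server/plugins/robots.ts — adjust rules at runtime via a Nitro hook
// (`robots:config` and the config shape below are assumptions)
export default defineNitroPlugin((nitroApp) => {
  nitroApp.hooks.hook('robots:config', (config) => {
    // hypothetical example: hide a feature-flagged section from crawlers
    config.groups = config.groups || []
    config.groups.push({ userAgent: ['*'], disallow: ['/beta'] })
  })
})
```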
When used with an i18n integration, the module will automatically fix any non-localised paths within your allow and disallow rules.
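As an illustrative sketch, assuming an en/fr locale prefix strategy, a non-localised rule could be expanded along these lines:

```txt
# Rule as configured (non-localised)
Disallow: /admin

# Rules as generated (locale prefixes added automatically; illustrative)
Disallow: /admin
Disallow: /en/admin
Disallow: /fr/admin
```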