Want to know why you need this module? Check out the introduction.
To get started with Nuxt Robots, you need to install the dependency and add it to your Nuxt config.
Install via the Nuxt CLI, which adds the module to your Nuxt config automatically:

```bash
npx nuxt module add robots
```

Or install the dependency with your package manager of choice:

```bash
# npm
npm i @nuxtjs/robots
# yarn
yarn add @nuxtjs/robots
# pnpm
pnpm i @nuxtjs/robots
# bun
bun i @nuxtjs/robots
```

When installing with a package manager, you will need to manually add the module to your Nuxt config:

```ts
export default defineNuxtConfig({
  modules: [
    '@nuxtjs/robots',
  ],
})
```
To ensure the module is behaving as expected, you should first check /robots.txt is being generated.
It should show that the site is disallowed from indexing. This is expected: development environments should not be indexed by search engines.
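In development, the generated file will look something like the following (the exact contents and comments may vary between module versions):

```
User-agent: *
Disallow: /
```

The `Disallow: /` rule tells all crawlers not to index any path on the site.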
However, we want to see what a production environment would look like.
For this, it's recommended to use the Nuxt DevTools Robots tab to see the current configuration and how it's being applied.
The DevTools will show you that in production we're just serving a minimal robots.txt file:

```
User-agent: *
Disallow:
```

The empty `Disallow` directive allows all search engines to index the site.
Every site is different and will require its own unique configuration. To give you a head start, you may consider the following areas to configure.
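For example, crawler rules can be set directly in the module options. A minimal sketch, assuming the `robots` config key with `disallow` and `sitemap` options from recent @nuxtjs/robots versions (the paths here are hypothetical — check the module's options reference for your version):

```ts
export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],
  robots: {
    // Block crawlers from non-public areas (example paths)
    disallow: ['/admin', '/internal'],
    // Optionally point crawlers at your sitemap
    sitemap: '/sitemap.xml',
  },
})
```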
Make sure you understand the differences between the robots.txt file and the robots meta tag with the Controlling Web Crawlers guide.
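If you need page-level control rather than path rules in robots.txt, the module also integrates with Nuxt route rules. A sketch, assuming the `robots` route rule key supported by recent @nuxtjs/robots versions (the `/admin/**` path is illustrative):

```ts
export default defineNuxtConfig({
  routeRules: {
    // Mark these pages as non-indexable via the robots meta tag
    // and X-Robots-Tag header instead of robots.txt
    '/admin/**': { robots: false },
  },
})
```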
You've successfully installed Nuxt Robots and configured it for your project.
Documentation is provided for module integrations; check them out if you're using them.
Next check out the robots.txt recipes guide for some inspiration.