v4.0.0
Nuxt Simple Robots is now Nuxt Robots
In a discussion with the team and the community, we have decided to migrate the nuxt-simple-robots module to the @nuxtjs/robots package.
This will allow me to better maintain the module and provide a more consistent experience across the Nuxt ecosystem.
To upgrade, simply replace the dependency in your package.json:
- 'nuxt-simple-robots'
+ '@nuxtjs/robots'
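If you register the module in your Nuxt config, the `modules` entry needs the same rename; a minimal sketch:

```typescript
// nuxt.config.ts
export default defineNuxtConfig({
  modules: [
    // previously: 'nuxt-simple-robots'
    '@nuxtjs/robots',
  ],
})
```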
@nuxtjs/robots v3 breaking changes
If you're coming from @nuxtjs/robots v3, please note the following breaking changes:
- The `configPath` config is no longer supported. For custom runtime config you should use Nitro Hooks.
- The `rules` config is deprecated but will continue to work. Any `BlankLine` or `Comment` rules will no longer work.
- Using `CleanParam`, `CrawlDelay` and `Disavow` requires targeting the Yandex user agent.
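For example, to keep using `CleanParam` you would scope it to a group targeting Yandex. This is a hedged sketch only; the `groups`, `userAgent` and `cleanParam` key names are assumed from the module's group config and should be verified against the docs:

```typescript
// nuxt.config.ts — sketch; verify key names against the module docs
export default defineNuxtConfig({
  robots: {
    groups: [
      // generic rules applied to all crawlers
      { userAgent: ['*'], disallow: ['/admin'] },
      // Yandex-specific directives such as Clean-param
      { userAgent: ['Yandex'], cleanParam: ['utm_source /'] },
    ],
  },
})
```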
Features
useRobotsRule()
A new Nuxt composable useRobotsRule() has been introduced, allowing you to access and modify the robots rule for the current route.
import { useRobotsRule } from '#imports'
const rule = useRobotsRule()
// Ref<'noindex, nofollow'>
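Since the composable returns a ref, updating the rule should be as simple as assigning a new value; a sketch, assuming the ref is writable as the description above implies:

```typescript
import { useRobotsRule } from '#imports'

const rule = useRobotsRule()
// update the robots rule for the current route at runtime,
// e.g. to hide a page based on dynamic data
rule.value = 'noindex, nofollow'
```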
Robots.txt validation
When loading in a robots.txt file, the module will now validate it, warning you of any errors in your rules.
This will help you avoid errors from Google Search Console and Google Lighthouse.
Default Meta Tag
The module now adds the meta tag to your site by default. The composable and component helpers previously used to define it have been deprecated.
<!-- Example for an indexable route -->
<meta name="robots" content="index, follow">
Adding the meta tag is important for pages that are prerendered, as the `X-Robots-Tag` HTTP header can not be served for static files.
You can opt out of the default meta tag in the module configuration.
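A sketch of the opt-out, assuming the option is named `metaTag` as it was in nuxt-simple-robots (check the module docs for the current name):

```typescript
// nuxt.config.ts — `metaTag` is assumed from the nuxt-simple-robots config
export default defineNuxtConfig({
  robots: {
    metaTag: false,
  },
})
```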
I18n Integration
The module now integrates with nuxt-i18n.
This will automatically re-configure your `allow` and `disallow` rules to include all locale prefixes, as shown below.
export default defineNuxtConfig({
robots: {
allow: ['/about'],
disallow: ['/admin'],
},
i18n: {
strategy: 'prefix_except_default',
locales: ['en', 'fr'],
defaultLocale: 'en',
},
})
# robots.txt
User-agent: *
Allow: /about
Allow: /fr/about
Disallow: /admin
Disallow: /fr/admin
Learn more on the I18n Integration docs.
Nuxt Content Integration
The module now integrates with @nuxt/content, allowing you to use the `robots` frontmatter key within your markdown files.
---
robots: false
---
Learn more on the Nuxt Content docs.
Nuxt DevTools Integration
The module now integrates with Nuxt DevTools.
You can visit the Robots tab and see if the current route is indexable, and if not, why.
New Nitro Hook and Util Exports
This version introduces a new `robots:config` Nitro hook, allowing you to modify the robots config at runtime.
Likewise, you can now re-use the internal functions to parse, validate and generate robots.txt data using the `@nuxtjs/robots/util` export.
import { defu } from 'defu'
import { parseRobotsTxt } from '@nuxtjs/robots/util'

export default defineNitroPlugin((nitroApp) => {
  nitroApp.hooks.hook('robots:config', async (ctx) => {
    if (ctx.context === 'robots.txt') {
      // merge groups from a remote robots.txt into the runtime config
      const customRobotsTxt = await $fetch('https://example.com/robots.txt')
      const parsed = parseRobotsTxt(customRobotsTxt)
      ctx.groups = defu(ctx.groups, parsed.groups)
    }
  })
})
Breaking Changes
Site Config
The deprecated Nuxt config keys for configuring your site have been removed. You will need to configure these using Site Config instead.
export default defineNuxtConfig({
robots: {
- indexable: false,
},
site: {
+ indexable: false,
}
})
Deprecations
defineRobotMeta() and <RobotMeta>
Because the module now uses a default meta tag, the defineRobotMeta() composable and <RobotMeta> component no longer serve a purpose.
You should remove these from your code.
index Route Rule
The `index` route rule has been renamed to `robots`:
export default defineNuxtConfig({
routeRules: {
'/admin': {
- index: false,
+ robots: false,
}
}
})