Introduction
Nuxt Robots v5 is a small major release that removes deprecated APIs and adds support for the Nuxt SEO v2 stable release.
Breaking Changes
Site Config v3
Nuxt Site Config is a module used internally by Nuxt Robots.
Its major update to v3.0.0 shouldn't directly affect your site; however, you may want to double-check the breaking changes.
`rules` Config Removed
Nuxt Robots v4 provided backward compatibility for the v3 `rules` config. This compatibility layer has been removed in v5; use `groups` instead.
```diff
export default defineNuxtConfig({
  robots: {
-   rules: {},
+   groups: {}
  }
})
```
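For reference, a minimal sketch of what a populated `groups` entry might look like (the user agent and path below are purely illustrative):

```ts
export default defineNuxtConfig({
  robots: {
    groups: [
      {
        // illustrative values: block an example /admin path for all crawlers
        userAgent: ['*'],
        disallow: ['/admin'],
      },
    ],
  },
})
```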
`defineRobotMeta` Composable Removed
This composable didn't do anything in v4 as the robots meta tag is enabled by default. If you'd like to control the robots meta tag rule, use the `useRobotsRule()` composable instead.
```diff
- defineRobotMeta(true)
+ useRobotsRule(true)
```
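As a rough sketch, the composable can be called from any page or component setup; the page below is purely illustrative:

```vue
<script setup lang="ts">
// Illustrative page component: explicitly mark this route as indexable.
// Passing `false` instead should apply a noindex rule to the route.
useRobotsRule(true)
</script>
```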
`RobotMeta` Component Removed
This component was a simple wrapper for the `defineRobotMeta` composable. As that composable has been removed and the robots meta tag is rendered by default, you can safely delete the component from your templates, as shown in the sketch below.
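A minimal illustration of the change in a hypothetical app component:

```diff
<template>
-  <RobotMeta />
   <NuxtPage />
</template>
```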
`index`, `indexable` Config Removed
When configuring robots using route rules or Nuxt Content, you could previously control crawler behavior by providing `index` or `indexable` values. These are no longer supported; use the `robots` key instead.
```diff
export default defineNuxtConfig({
  routeRules: {
    // use the `robots` shortcut for simple rules
-   '/secret/**': { index: false },
+   '/secret/**': { robots: false },
  }
})
```
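The same rename applies when using Nuxt Content; as a sketch, the frontmatter of a hypothetical page would now use the `robots` key:

```md
---
title: Secret page
robots: false
---
```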
Features
`blockAiBots` Config
AI crawlers can be beneficial as they can help users find your site, but some sites, such as educational ones or those simply not interested in being indexed by AI crawlers, may want to block them. You can do so with the `blockAiBots` option.
```ts
export default defineNuxtConfig({
  robots: {
    blockAiBots: true
  }
})
```
This will block the following AI crawlers: