Introduction
The v5 major of Nuxt Robots is a simple release that removes deprecations and adds support for the stable Nuxt SEO v2.
Breaking Features
Site Config v3
Nuxt Site Config is a module used internally by Nuxt Robots.
Its major update to v3.0.0 shouldn't have any direct effect on your site; however, you may want to double-check the breaking changes.
rules config Removed
Nuxt Robots v4 provided a backward-compatible rules config. As it was deprecated, it is no longer supported. If you're using rules, you should migrate to the groups config or use a robots.txt file.
export default defineNuxtConfig({
  robots: {
-   rules: {},
+   groups: {}
  }
})
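For reference, a minimal sketch of what a migrated groups config could look like, assuming you want to disallow a hypothetical /admin path for all user agents:

export default defineNuxtConfig({
  robots: {
    groups: [
      {
        // placeholder rule: block every crawler from /admin
        userAgent: ['*'],
        disallow: ['/admin'],
      },
    ],
  },
})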
defineRobotMeta composable Removed
This composable didn't do anything in v4, as the robots meta tag is enabled by default. If you'd like to control the robots meta tag rule, use the useRobotsRule() composable.
- defineRobotMeta(true)
+ useRobotsRule(true)
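As a minimal sketch, assuming a page you want to keep out of search results, the composable can be used in the page's setup to read or override the rule:

<script lang="ts" setup>
// useRobotsRule is auto-imported in a Nuxt app; it returns a ref,
// and assigning to it updates the robots meta tag for this page
const rule = useRobotsRule()
rule.value = 'noindex, nofollow'
</script>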
RobotMeta component Removed
This component was a simple wrapper for defineRobotMeta; you should use useRobotsRule() if you wish to control the robots rule.
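As a rough before-and-after, assuming a page that previously rendered the component in its template:

- <template>
-   <RobotMeta />
- </template>
+ <script lang="ts" setup>
+ useRobotsRule(true)
+ </script>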
index, indexable config Removed
When configuring robots using route rules or Nuxt Content, you could control crawler behavior by providing index or indexable rules. These are no longer supported; you should use the robots key instead.
export default defineNuxtConfig({
  routeRules: {
    // replace the `index` shortcut with the `robots` key
-   '/secret/**': { index: false },
+   '/secret/**': { robots: false },
  }
})
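The same change applies to Nuxt Content documents; a minimal sketch of a markdown page's frontmatter, assuming a page you want hidden from crawlers:

---
- index: false
+ robots: false
---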
Features
blockAiBots Config
AI crawlers can be beneficial as they can help users find your site, but for some educational sites, or those not interested in being indexed by AI crawlers, you can block them using the blockAiBots option.
export default defineNuxtConfig({
  robots: {
    blockAiBots: true
  }
})
This will block the following AI crawlers: GPTBot, ChatGPT-User, Claude-Web, anthropic-ai, Applebot-Extended, Bytespider, CCBot, cohere-ai, Diffbot, FacebookBot, Google-Extended, ImagesiftBot, PerplexityBot, OmigiliBot and Omigili.
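With the option enabled, the generated /robots.txt should end up containing a group roughly along these lines (shortened here, with one User-agent line per blocked crawler):

User-agent: GPTBot
User-agent: ChatGPT-User
User-agent: Claude-Web
Disallow: /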