# v4.0.0

## Nuxt Simple Robots is now Nuxt Robots
In a discussion with the team and the community, we have decided to migrate `nuxt-simple-robots` into the `@nuxtjs/robots` module. This will allow me to better maintain the module and provide a more consistent experience across the Nuxt ecosystem.
To upgrade, simply replace the dependency in your `package.json` and update your `nuxt.config`.

```diff
- 'nuxt-simple-robots'
+ '@nuxtjs/robots'
```
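In context, the change in your `nuxt.config` looks something like this (assuming the module is registered in the `modules` array):

```diff
export default defineNuxtConfig({
  modules: [
-   'nuxt-simple-robots',
+   '@nuxtjs/robots',
  ],
})
```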
If you're coming from `nuxt-simple-robots`, then no other changes are needed. If you're coming from `@nuxtjs/robots` v3, then the following breaking changes exist.
## @nuxtjs/robots v3 breaking changes
- The `configPath` config is no longer supported. For custom runtime config you should use Nitro Hooks.
- The `rules` config is deprecated but will continue to work. Any `BlankLine` or `Comment` rules will no longer work.
- Using `CleanParam`, `CrawlDelay` and `Disavow` requires targeting the Yandex user agent.
## Features
### `useRobotsRule()`

A new Nuxt composable, `useRobotsRule()`, has been introduced to let you access and modify the robots rule for the current route.
```ts
import { useRobotsRule } from '#imports'

const rule = useRobotsRule() // Ref<'noindex, nofollow'>
```
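Since the composable returns a reactive ref, it can also be written to. A minimal sketch, assuming the rule for the current route is writable at runtime:

```ts
import { useRobotsRule } from '#imports'

const rule = useRobotsRule()
// hypothetical: override the robots rule for this route
rule.value = 'noindex, nofollow'
```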
### Robots.txt Validation

When loading in a `robots.txt` file, the module will now validate the file to ensure each of the `disallow` and `allow` paths is valid. This will help you avoid errors from Google Search Console and Google Lighthouse.
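As an illustration, this is the kind of mistake the validation is meant to surface (hypothetical example, assuming paths are required to start with a `/`):

```txt
User-agent: *
# invalid: path does not start with a slash
Disallow: secret
# valid
Disallow: /secret
```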
### Default Meta Tag

The module now adds the robots meta tag to your site by default. The composable and component helpers previously used to define this have been deprecated.
```html
<!-- Example for an indexable route -->
<meta name="robots" content="index, follow">
```
Adding the meta tag is important for prerendered pages, as the `X-Robots-Tag` header is not available for them. You can opt out with `metaTags: false`.
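A minimal sketch of the opt-out, assuming the option sits under the module's `robots` config key:

```ts
export default defineNuxtConfig({
  robots: {
    // disable the default robots meta tag
    metaTags: false,
  },
})
```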
### I18n Integration

The module now integrates with nuxt-i18n. This will automatically re-configure your `allow` and `disallow` rules to include the locale prefix if you have omitted it.
```ts
export default defineNuxtConfig({
  robots: {
    allow: ['/about'],
    disallow: ['/admin'],
  },
  i18n: {
    strategy: 'prefix_except_default',
    locales: ['en', 'fr'],
    defaultLocale: 'en',
  },
})
```
```txt
# robots.txt
User-agent: *
Allow: /about
Allow: /fr/about
Disallow: /admin
Disallow: /fr/admin
```
Learn more on the I18n Integration docs.
### Nuxt Content Integration

The module now integrates with @nuxt/content, allowing you to use the `robots` frontmatter key within your markdown files.
```md
---
robots: false
---
```
Learn more on the Nuxt Content docs.
### Nuxt DevTools Integration

The module now integrates with Nuxt DevTools. You can visit the Robots tab to see whether the current route is indexable, and if not, why.
### New Nitro Hook and Util Exports

This version introduces a new Nitro hook, `robots:config`. This hook lets you override the robots.txt data as a JavaScript object, instead of a string.

Likewise, you can now re-use any of the internal functions to parse, validate and generate robots.txt data using the `@nuxtjs/robots/util` export.
```ts
import { defu } from 'defu'
import { parseRobotsTxt } from '@nuxtjs/robots/util'

export default defineNitroPlugin((nitroApp) => {
  nitroApp.hooks.hook('robots:config', async (ctx) => {
    if (ctx.context === 'robots.txt') {
      // merge groups from a remote robots.txt into our config
      const customRobotsTxt = await $fetch<string>('https://example.com/robots.txt')
      const parsed = parseRobotsTxt(customRobotsTxt)
      ctx.groups = defu(ctx.groups, parsed.groups)
    }
  })
})
```
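Dropping this file into your project's `server/plugins/` directory is enough for Nitro to register it automatically.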
## Breaking Changes
### Site Config

The deprecated Nuxt Config site config keys have been removed: `host`, `siteUrl` and `indexable`. You will need to configure these using Site Config.
```diff
export default defineNuxtConfig({
  robots: {
-   indexable: false,
  },
  site: {
+   indexable: false,
  },
})
```
## Deprecations
### `defineRobotMeta()` and `<RobotMeta>`

Because the module now uses a default meta tag, the `defineRobotMeta()` function and `<RobotMeta>` component are deprecated. You should remove these from your code.
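For example, if a page previously called the composable (hypothetical usage), the call can simply be deleted:

```diff
<script lang="ts" setup>
- defineRobotMeta()
</script>
```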
### `index` Route Rule

The `index` route rule has been deprecated in favour of the `robots` rule. This provides less ambiguity and more control over the rule.
```diff
export default defineNuxtConfig({
  routeRules: {
    '/admin': {
-     index: false,
+     robots: false,
    }
  }
})
```