Guides
Config Using Route Rules
Last updated by Harlan Wilton in chore: broken tests.

If you prefer, you can use route rules to configure how your routes are indexed by search engines.
You can provide the following rules:
- `{ robots: false }` - disables indexing for the route using the `robotsDisabledValue` config
- `{ robots: '<rule>' }` - adds the provided string as the robots rule
- `{ robots: { /* directives */ } }` - uses object syntax to define robots directives
The rules are applied using the following logic:
- `X-Robots-Tag` header - SSR only
- `<meta name="robots">`
- `/robots.txt` disallow entry - when `disallowNonIndexableRoutes` is enabled
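As an illustrative sketch only (the function name `resolveRobotsRule` and the `'noindex, nofollow'` default are assumptions, not the module's actual internals), a route-rule `robots` value can be thought of as normalizing to the string that ends up in the `X-Robots-Tag` header and the `<meta name="robots">` content:

```typescript
// Hypothetical normalization of a route-rule `robots` value into a
// robots directive string. Not the module's real implementation.
type RobotsRule = boolean | string | Record<string, boolean | string | number>

function resolveRobotsRule(
  rule: RobotsRule,
  robotsDisabledValue = 'noindex, nofollow' // assumed default
): string {
  if (rule === true) return 'index, follow'
  if (rule === false) return robotsDisabledValue
  if (typeof rule === 'string') return rule
  // object syntax: `true` directives become bare tokens, valued ones `key:value`
  return Object.entries(rule)
    .filter(([, value]) => value !== false)
    .map(([key, value]) => (value === true ? key : `${key}:${value}`))
    .join(', ')
}

resolveRobotsRule(false) // → 'noindex, nofollow'
resolveRobotsRule({ 'index': true, 'max-snippet': 100 }) // → 'index, max-snippet:100'
```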
Inline Route Rules
Requires enabling the experimental `inlineRouteRules` option; see the `defineRouteRules` documentation to learn more.
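The experimental flag is enabled in your Nuxt config; a minimal sketch, assuming a Nuxt version where `experimental.inlineRouteRules` is available:

```typescript
export default defineNuxtConfig({
  experimental: {
    // required for defineRouteRules() in page components
    inlineRouteRules: true
  }
})
```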
```vue
<script lang="ts" setup>
defineRouteRules({
  robots: false,
})
</script>
```
Nuxt Config
nuxt.config.ts

```ts
export default defineNuxtConfig({
  routeRules: {
    // use the boolean shortcut for simple rules
    '/secret/**': { robots: false },
    // add exceptions for individual routes
    '/secret/visible': { robots: true },
    // provide a string if you need finer control
    '/custom-robots': { robots: 'index, follow' },
    // use object syntax for more complex rules
    '/ai-protected': {
      robots: {
        index: true,
        noai: true,
        noimageai: true
      }
    },
    // control search result previews
    '/limited-preview': {
      robots: {
        'index': true,
        'max-image-preview': 'standard',
        'max-snippet': 100,
        'max-video-preview': 15
      }
    }
  }
})
```