Config Using Route Rules
If you prefer, you can use route rules to configure how your routes are indexed by search engines.
You can provide the following rules:
- { robots: false } - Will disable the route from being indexed, using the robotsDisabledValue config.
- { robots: '<rule>' } - Will add the provided string as the robots rule.
The rules are applied using the following logic:
- X-Robots-Tag header - SSR only
- <meta name="robots">
- /robots.txt disallow entry - when disallowNonIndexableRoutes is enabled
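For example, if you also want disabled routes listed as disallow entries in /robots.txt, you can turn on the disallowNonIndexableRoutes option mentioned above. A minimal sketch, again assuming the module options live under the robots key:

nuxt.config.ts
export default defineNuxtConfig({
  robots: {
    // emit /robots.txt disallow entries for routes disabled via route rules
    disallowNonIndexableRoutes: true,
  },
})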
Inline Route Rules
Requires enabling the experimental inlineRouteRules option; see the defineRouteRules documentation to learn more.
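If you haven't enabled the flag yet, it lives under Nuxt's experimental options. A minimal sketch:

nuxt.config.ts
export default defineNuxtConfig({
  experimental: {
    // allows defineRouteRules() to be used inside page components
    inlineRouteRules: true,
  },
})

With the flag enabled, you can declare the rule directly in a page: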
<script lang="ts" setup>
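// opt this page out of search engine indexing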
defineRouteRules({
  robots: false,
})
</script>
Nuxt Config
nuxt.config.ts
export default defineNuxtConfig({
  routeRules: {
    // use the boolean shortcut for simple rules
    '/secret/**': { robots: false },
    // add exceptions for individual routes
    '/secret/visible': { robots: true },
    // provide a string if you need finer control
    '/custom-robots': { robots: 'index, follow' },
  }
})