# Disabling Site Indexing

Learn how to disable indexing for different environments and conditions to avoid crawling issues.

There are a number of reasons why you may want to disable indexing of your site.

I've outlined some of the most common scenarios below.

Note: if you're just trying to stop specific routes from being indexed, you should use the robots.txt config instead.
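For instance, if you're using the @nuxtjs/robots module, a route-level rule might look something like this (a minimal sketch; the paths are placeholders):

```ts
// nuxt.config.ts (assumes the @nuxtjs/robots module, which turns
// these rules into robots.txt entries)
export default defineNuxtConfig({
  robots: {
    // hypothetical routes you don't want crawled
    disallow: ['/admin', '/internal'],
  },
})
```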

## Hidden Production

If you have a production site that you don't want search engines to know about, you should explicitly set the `indexable` option to `false`.

This would be common for internal business tools, or sites that are not ready for public consumption.

```ts
export default defineNuxtConfig({
  site: { indexable: false }
})
```
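With this set, the robots integration will serve a blocking robots.txt, typically along these lines:

```txt
User-agent: *
Disallow: /
```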

## Preview / Staging / Testing Environments

These environments are great for testing out code before it goes to production, but you definitely don't want search engines to index them.

To keep them hidden, you will need to provide a site env for each of them.

This is only needed for non-production environments, as the default value is `production` when your Nuxt app is built.

Because the value changes depending on your environment, you should set it with the `NUXT_SITE_ENV` environment variable.

```bash
NUXT_SITE_ENV=preview
```

Feel free to set it to whatever you want, as long as it's not `production`.
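For example, when starting the built Nuxt server on a staging host (the env name here is just an assumption; anything other than `production` works):

```bash
NUXT_SITE_ENV=staging node .output/server/index.mjs
```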

## Alternative Production URLs

If you have multiple domains or subdomains pointing to your canonical URL, you only want one of them to be indexed.

For example, if both www.example.com and example.com serve your site, you'd typically want only example.com to be indexed.

In this case, you should permanently redirect (301) to the canonical URL instead of opting out of indexing.

If you're using the Nuxt SEO module, you can use the `redirectToCanonicalSiteUrl` option.
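A minimal sketch, assuming the `@nuxtjs/seo` module is installed and example.com is your canonical host:

```ts
// nuxt.config.ts
export default defineNuxtConfig({
  modules: ['@nuxtjs/seo'],
  site: {
    // the canonical URL; requests from other hosts get redirected here
    url: 'https://example.com',
  },
  seo: {
    // 301-redirect when the request host doesn't match the site URL
    redirectToCanonicalSiteUrl: true,
  },
})
```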

## Multi-tenancy Production Site

This is a bit more complicated, but it's where Nuxt Site Config shines.

See the Runtime Site Config guide to learn how to set site config at runtime.
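As a rough sketch of what that can look like, assuming the `site-config:init` Nitro hook covered in that guide and hypothetical tenant hostnames, you might toggle `indexable` per request:

```ts
// server/plugins/site-config.ts
import { getRequestHost } from 'h3'

export default defineNitroPlugin((nitroApp) => {
  nitroApp.hooks.hook('site-config:init', ({ event, siteConfig }) => {
    const host = getRequestHost(event)
    // only the primary tenant (hypothetical host) should be indexable
    siteConfig.push({
      indexable: host === 'app.example.com',
    })
  })
})
```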