
v4.0.0

Release notes for Nuxt Simple Robots v4.0.0.

Features

Robots.txt validation

When loading a robots.txt file, the module now validates it to ensure each of the allow and disallow paths is valid.

This will help you avoid errors from Google Search Console and Google Lighthouse.
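For example, a disallow path that doesn't begin with a slash is invalid per the robots.txt specification, and this is the kind of rule the validation will catch (a hypothetical sample; the exact warning output is up to the module):

# robots.txt
User-agent: *
Disallow: admin   # invalid: paths must start with "/"
Disallow: /admin  # valid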

Default Meta Tag

The module now adds the robots meta tag to your site by default. The composable and component helpers previously used to define this tag have been deprecated.

<!-- Example for an indexable route -->
<meta name="robots" content="index, follow">

Adding the meta tag is important for prerendered pages, as the X-Robots-Tag HTTP header is not available for them.
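For comparison, on server-rendered routes the same directives can be delivered as a response header instead (shown for illustration):

# HTTP response header on a server-rendered route
X-Robots-Tag: index, follow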

You can opt out with metaTags: false.
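For example, assuming the option sits under the module's robots config key:

export default defineNuxtConfig({
  robots: {
    // opt out of the default robots meta tag
    metaTags: false,
  },
})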

I18n Integration

The module now integrates with nuxt-i18n.

The module will automatically expand your allow and disallow rules to include the locale prefix where you have omitted it. For example:

export default defineNuxtConfig({
  robots: {
    allow: ['/about'],
    disallow: ['/admin'],
  },
  i18n: {
    strategy: 'prefix_except_default',
    locales: ['en', 'fr'],
    defaultLocale: 'en',
  },
})

# robots.txt
User-agent: *
Allow: /about
Allow: /fr/about
Disallow: /admin
Disallow: /fr/admin

Learn more on the I18n Integration docs.

Nuxt Content Integration

The module now integrates with @nuxt/content, allowing you to use the robots frontmatter key within your markdown files.

---
robots: false
---
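With this set, the page should be treated as non-indexable, presumably flipping the default meta tag to something like:

<!-- Rendered for the markdown page above -->
<meta name="robots" content="noindex, nofollow">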

Learn more on the Nuxt Content docs.

Nuxt DevTools Integration

The module now integrates with Nuxt DevTools.

You can visit the Robots tab and see if the current route is indexable, and if not, why.

New Nitro Hook and Util Exports

This version introduces a new Nitro hook, robots:config, which lets you override the robots.txt data as a JavaScript object instead of a string.

Likewise, you can now reuse any of the internal functions to parse, validate and generate robots.txt data using the nuxt-simple-robots/util export.

import { defu } from 'defu'
import { parseRobotsTxt } from 'nuxt-simple-robots/util'

export default defineNitroPlugin((nitroApp) => {
  nitroApp.hooks.hook('robots:config', async (ctx) => {
    if (ctx.context === 'robots.txt') {
      // Fetch an external robots.txt and merge its groups into the generated config.
      const customRobotsTxt = await $fetch('https://example.com/robots.txt')
      const parsed = parseRobotsTxt(customRobotsTxt)
      ctx.groups = defu(ctx.groups, parsed.groups)
    }
  })
})

Breaking Changes

Site Config

The deprecated site config keys have been removed from the module's Nuxt config: host, siteUrl and indexable.

You will need to configure these using Site Config.

export default defineNuxtConfig({
  robots: {
-   indexable: false,
  },
  site: {
+   indexable: false,
  }
})

Deprecations

defineRobotMeta() and <RobotMeta>

Because the module now uses a default meta tag, the defineRobotMeta() function and <RobotMeta> component are deprecated.

You should remove these from your code, as shown below.
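For example, if a page was calling the deprecated composable, simply delete the call (the same applies to any <RobotMeta /> usage):

<script setup>
- defineRobotMeta()
</script>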

index Route Rule

The index route rule has been deprecated in favor of the robots rule, which is less ambiguous and offers more control.

export default defineNuxtConfig({
  routeRules: {
    '/admin': {
-      index: false,
+      robots: false,
    }
  }
})