---
title: "v6.0.0"
description: "Release notes for Nuxt Robots v6.0.0."
canonical_url: "https://nuxtseo.com/docs/robots/releases/v6"
last_updated: "2026-05-06T18:45:58.380Z"
---

## Introduction

The v6 major release of Nuxt Robots adds new content composables, improves validation with warnings support, and includes several important bug fixes.

## ⚠️ Breaking Changes

### Site Config v4

Nuxt Site Config is a module used internally by Nuxt Robots.

Its major update to v4.0.0 shouldn't have any direct effect on your site; however, you may want to double-check
the [breaking changes](https://github.com/harlan-zw/nuxt-site-config/releases/tag/v4.0.0).

### `robots:config` Hook Context

The `HookRobotsConfigContext` now includes a `warnings: string[]` field alongside the existing `errors: string[]`. If you use the `robots:config` Nitro hook and inspect the context, be aware of this new field.
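If you read the context from a Nitro plugin, handling the new field might look like this (the plugin file name is illustrative):

```ts [server/plugins/robots-warnings.ts]
export default defineNitroPlugin((nitroApp) => {
  nitroApp.hooks.hook('robots:config', (ctx) => {
    // New in v6: `warnings` sits alongside the existing `errors`
    if (ctx.warnings.length) {
      console.warn('[robots] config warnings:', ctx.warnings)
    }
  })
})
```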

## 🚀 Features

### `defineRobotsSchema()` Composable

A new composable for `@nuxt/content` v3 that simplifies adding robots fields to your content schema.

```ts [content.config.ts]
import { defineCollection, defineContentConfig } from '@nuxt/content'
import { defineRobotsSchema } from '@nuxtjs/robots/content'
import { z } from 'zod'

export default defineContentConfig({
  collections: {
    pages: defineCollection({
      type: 'page',
      source: '**/*.md',
      schema: z.object({
        robots: defineRobotsSchema(),
      }),
    }),
  },
})
```

This replaces the previous `asRobotsCollection()` helper, which is now deprecated.
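With the schema in place, individual pages can set robots rules from their frontmatter. A minimal sketch, assuming `false` is accepted to disable indexing:

```md [content/internal.md]
---
title: Internal Page
robots: false
---
```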

### Validation Warnings

The robots.txt validation system now supports warnings in addition to errors. The first built-in warning flags `Disallow: /api` rules, which may unintentionally block API routes that need to remain accessible.
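For example, a configuration like the following would now surface the warning (a minimal sketch using the module's `disallow` option):

```ts [nuxt.config.ts]
export default defineNuxtConfig({
  robots: {
    // Triggers the new warning: this rule also blocks /api/* routes
    // that may need to stay accessible.
    disallow: ['/api'],
  },
})
```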

Warnings appear in the devtools debug view alongside errors, helping you catch potential misconfigurations before they cause issues.

### Production Debug Route

A new `/__robots__/debug-production.json` server route is available in development. It fetches your production site's `robots.txt`, validates it, and returns a structured response with errors, warnings, parsed groups, and sitemaps. This makes it easy to compare your local configuration against what's live in production.
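You can fetch the route during development to diff your local rules against production. A sketch; the response keys beyond those named above are assumptions:

```ts
// Development only: compare the local config against the live site.
const report = await $fetch('/__robots__/debug-production.json')

console.log(report.errors) // validation errors from the production robots.txt
console.log(report.warnings) // e.g. the new `Disallow: /api` warning
```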

## 🔧 Bug Fixes

### `skipSiteIndexable` Now Skips `Disallow: /`

Previously, setting `skipSiteIndexable: true` (used by sitemap generation) only skipped the site config indexable check. It now also filters out `Disallow: /` root disallow rules from path matching, ensuring sitemap URLs are correctly generated on staging or non-indexable environments. Specific path rules like `/admin` still apply as expected.
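If you query the matcher from Nitro yourself, the behaviour is reachable through the `getPathRobotConfig` util's `skipSiteIndexable` option. A rough sketch, assuming the util's documented signature:

```ts [server/api/robots-check.ts]
import { getPathRobotConfig } from '#imports'

export default defineEventHandler((e) => {
  // A root `Disallow: /` is now ignored here, while specific rules
  // such as `Disallow: /admin` still match.
  return getPathRobotConfig(e, {
    path: '/blog/my-post',
    skipSiteIndexable: true,
  })
})
```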

### Route Rules Nullish Guard

Route rules with `undefined` or `null` values no longer cause runtime errors. The `normaliseRobotsRouteRule` function now safely handles nullish input.
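For example, conditionally computed route rules no longer need a guard against `undefined` (a minimal sketch):

```ts [nuxt.config.ts]
const blockDrafts = process.env.BLOCK_DRAFTS === 'true'

export default defineNuxtConfig({
  routeRules: {
    // `undefined` previously could throw inside normaliseRobotsRouteRule;
    // it is now treated as "no rule".
    '/drafts/**': { robots: blockDrafts ? false : undefined },
  },
})
```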

### Devtools Meta Tag Parsing

Meta tag parsing in the devtools debug view is now attribute order agnostic, fixing cases where `<meta content="..." name="robots">` was not detected.
