Debug Indexing Issues in Google Search Console

Fix "crawled currently not indexed" and other GSC coverage errors affecting your Vue site.
Harlan Wilton · 10 min read
What you'll learn
  • "Crawled - currently not indexed" usually means low quality, duplicate, or thin content
  • SPAs need SSR or prerendering for reliable indexing—Google may not execute JavaScript properly
  • Internal links signal importance—orphan pages rarely get indexed

Google crawled your page but won't index it. This happens to millions of pages daily. The Page Indexing report in Google Search Console shows exactly which pages have issues and why.

Understanding Page Indexing Status

Google Search Console's Page Indexing report shows seven main statuses:

Good Statuses

Indexed: Page appears in Google's search index. This is what you want.

Warning Statuses

Discovered - currently not indexed: Google found your page but hasn't crawled it yet. The page sits in Google's queue. This is normal for new sites and low-priority pages.

Crawled - currently not indexed: Google crawled your page but chose not to index it. This means Google decided your content isn't worth showing in search results.

Excluded Statuses

Excluded by robots.txt: Your robots.txt file blocks Google from accessing the page. Check your robots.txt configuration.

Blocked by noindex tag: Page has a noindex meta tag or HTTP header. Remove it if you want the page indexed.

Duplicate without canonical: Google found multiple identical pages without proper canonical tags. Set canonical URLs.

Soft 404: Page returns a 200 status code but looks like a 404 error to Google. Fix by returning proper 404 status codes for missing content.
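For server-rendered Vue apps, fixing soft 404s means setting the real status during SSR. A minimal framework-agnostic sketch (the route handler shape and page HTML are illustrative assumptions); in an Express server you would then call `res.status(result.status).send(result.html)`:

```typescript
// Decide the HTTP status for an SSR response: a missing resource must
// return a real 404 status, even though we still render a friendly page.
interface SsrResult {
  status: number
  html: string
}

function renderProductRoute(product: { name: string } | null): SsrResult {
  if (!product) {
    return { status: 404, html: '<h1>Product not found</h1>' }
  }
  return { status: 200, html: `<h1>${product.name}</h1>` }
}
```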

Fixing "Crawled - Currently Not Indexed"

This status means Google crawled your page but decided it wasn't worth indexing. Google doesn't explicitly state why, but five main causes exist.

Thin or Low-Quality Content

Google skips pages with little unique value. Product pages with only titles and prices, blog posts under 300 words, and auto-generated content typically get excluded.

Fix: Add substantial content. Write detailed product descriptions, expand short blog posts to 800+ words, include images and videos, answer user questions comprehensively.
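A rough pre-publish check can flag thin pages automatically (a heuristic sketch; the 300-word threshold above is a guideline, not a documented Google cutoff):

```typescript
// Rough thin-content check: strip scripts, styles, and tags,
// then count whitespace-separated words in the remaining text.
function visibleWordCount(html: string): number {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<[^>]+>/g, ' ')
  return text.split(/\s+/).filter(Boolean).length
}

const isThin = (html: string): boolean => visibleWordCount(html) < 300
```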

Duplicate Content

Multiple pages with identical or near-identical content waste Google's crawl budget. Common culprits: paginated URLs without proper canonicals, URL parameters creating duplicate versions, tag/category archives with the same posts.

Fix: Implement canonical tags pointing to the primary version. Consolidate similar pages. Use rel="canonical" in your Vue head:

<script setup>
import { useHead } from '@unhead/vue'

useHead({
  link: [
    { rel: 'canonical', href: 'https://yoursite.com/primary-page' }
  ]
})
</script>

Too Many Similar Pages

Sites with thousands of nearly-identical pages (e.g., faceted search, filter combinations) trigger quality filters. Google picks representative pages and excludes the rest.

Fix: Use meta robots noindex on filter pages, parameter-based URLs, and search result pages. Consolidate similar content into fewer comprehensive pages.

<script setup>
import { computed } from 'vue'
import { useHead } from '@unhead/vue'
import { useRoute } from 'vue-router'

const route = useRoute()

// Noindex pages with filter parameters
const shouldNoIndex = computed(() =>
  Boolean(route.query.color || route.query.size || route.query.sort)
)

useHead({
  meta: [
    // Pass a getter so the tag updates when the query changes
    { name: 'robots', content: () => shouldNoIndex.value ? 'noindex,follow' : 'index,follow' }
  ]
})
</script>

Low Site Authority

New sites with few backlinks face stricter indexing thresholds. Google prioritizes crawling trusted sites.

Fix: Build backlinks through guest posting, create linkable assets (tools, research, guides), get listed in industry directories, promote content on social media. This takes months—be patient.

Orphan Pages

Pages without internal links from other pages on your site signal low importance to Google. Orphan pages lack link equity and often don't get indexed.

Fix: Add internal links from relevant pages. Include new pages in your main navigation or sidebar. Link from high-authority pages on your site.

<!-- Link to important pages from your main layout -->
<template>
  <nav>
    <RouterLink to="/important-page">
      Important Page
    </RouterLink>
  </nav>
</template>

Fixing "Discovered - Currently Not Indexed"

This status means Google knows your page exists but hasn't crawled it. Four main causes exist.

Site Too New

Google takes weeks to crawl new sites. For brand-new domains, expect 2-4 weeks before regular crawling starts.

Fix: Wait. Submit your sitemap. Request indexing for critical pages via URL Inspection tool. Keep publishing content regularly.

Crawl Budget Issues

Large sites (10,000+ pages) run into Google's crawl budget limits. Google won't crawl everything if your site has slow server responses, too many low-quality pages, or complex URL structures.

Fix: Optimize server response times (target under 200ms). Remove or noindex low-value pages. Fix redirect chains. Reduce duplicate content. Block unnecessary URLs in robots.txt:

# robots.txt
User-agent: *
# Allow important pages
Allow: /

# Block low-value sections
Disallow: /admin/
Disallow: /search?
Disallow: /*?filter=
Disallow: /print-version/
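You can sanity-check rules like these before deploying (a simplified sketch of robots.txt matching that handles prefixes and the `*` wildcard; real crawlers also apply `$` anchors, Allow overrides, and longest-match precedence):

```typescript
// Simplified robots.txt matcher: supports prefix rules and `*` wildcards.
// Not a full parser — use it only as a quick pre-deploy sanity check.
function isDisallowed(path: string, disallowRules: string[]): boolean {
  return disallowRules.some((rule) => {
    // Escape regex metacharacters, then turn `*` into `.*`
    const pattern = rule
      .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
      .replace(/\*/g, '.*')
    return new RegExp(`^${pattern}`).test(path)
  })
}

const rules = ['/admin/', '/search?', '/*?filter=', '/print-version/']
```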

Slow Server Response

If your server takes over 500ms to respond, Google may crawl fewer pages.

Fix: Enable caching, use a CDN, optimize database queries, upgrade hosting. Monitor server response times in Search Console's Crawl Stats report.
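To spot-check response times outside Search Console, a small Node sketch (Node 18+ global `fetch` assumed; the 200ms/500ms cutoffs are this article's guidelines, not documented Google thresholds):

```typescript
// Measure time-to-first-response for a URL, then classify it
// against the rough thresholds discussed above.
async function measureResponseMs(url: string): Promise<number> {
  const start = performance.now()
  const res = await fetch(url)
  await res.body?.cancel() // headers are enough; skip the body
  return performance.now() - start
}

function classifyResponseTime(avgMs: number): 'good' | 'borderline' | 'slow' {
  if (avgMs < 200) return 'good'       // comfortable for frequent crawling
  if (avgMs < 500) return 'borderline' // large sites may see reduced crawl rates
  return 'slow'                        // Google likely crawls fewer pages
}
```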

Pages Only in Sitemap

If pages exist only in your sitemap without internal links, Google considers them low priority.

Fix: Add internal links. Don't rely solely on sitemaps for discovery. Internal linking signals importance.

Vue SPA-Specific Issues

Single Page Applications create unique indexing challenges because content loads after the initial HTML renders.

JavaScript Rendering Problems

Google crawls your initial HTML first, then renders JavaScript in a second wave, which can lag behind the crawl by anywhere from minutes to days. If critical content only appears after JavaScript execution, indexing may be delayed or incomplete.

Test: View your page source (not DevTools). Right-click → View Page Source. If your content isn't visible in the raw HTML, Google's first crawl won't see it.

# Check if content is in initial HTML
curl -s https://yoursite.com/page | grep "expected content"

Fix: Use Server-Side Rendering (SSR) or Static Site Generation (SSG). For Vue SPAs without SSR, consider prerendering critical pages.
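One prerendering option for a Vue 3 SPA is vite-ssg (an illustrative choice; any prerenderer works). It renders each route to a static HTML file at build time, so Google's first crawl sees the full content. A sketch of the entry file, assuming your routes live in `./router`:

```typescript
// main.ts — vite-ssg entry: replaces createApp().mount()
// Running `vite-ssg build` writes one static HTML file per route
import { ViteSSG } from 'vite-ssg'
import App from './App.vue'
import { routes } from './router'

export const createApp = ViteSSG(App, { routes })
```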

Content Loaded After Initial Render

Vue apps often fetch data after mounting. Google may not wait for all async operations to complete.

<!-- PROBLEM: Content loads after mount -->
<script setup>
import { onMounted, ref } from 'vue'

const products = ref([])

onMounted(async () => {
  // Google's crawler might not wait for this
  products.value = await fetch('/api/products').then(r => r.json())
})
</script>

<template>
  <div v-for="product in products" :key="product.id">
    {{ product.name }}
  </div>
</template>

Fix: Render content during SSR or prerender pages at build time:

<!-- SOLUTION: Fetch data before render -->
<script setup>
import { ref } from 'vue'

// Top-level await makes this an async component — wrap it in
// <Suspense> (Nuxt and most SSR setups handle this for you)
const products = ref(await fetch('/api/products').then(r => r.json()))
</script>

<template>
  <div v-for="product in products" :key="product.id">
    {{ product.name }}
  </div>
</template>

For client-only apps, show loading states with descriptive text that Google can index:

<template>
  <div>
    <h1>Product Catalog</h1>
    <p v-if="loading">
      Loading 500+ products from our catalog...
    </p>
    <div v-for="product in products" v-else :key="product.id">
      {{ product.name }}
    </div>
  </div>
</template>

Client-Side Routing Issues

Vue Router changes URLs without full page reloads. If your router isn't configured correctly, Google may not discover all routes.

Fix: Generate a complete sitemap listing all routes. Don't rely on Google following JavaScript-generated links:

// generate-sitemap.ts — run at build time (e.g. `npx tsx generate-sitemap.ts`)
import { writeFileSync } from 'node:fs'
import { routes } from './router'

const urls = routes
  .filter(route => !route.path.includes(':')) // skip dynamic-param routes
  .map(route => `  <url>
    <loc>https://yoursite.com${route.path}</loc>
    <lastmod>${new Date().toISOString()}</lastmod>
  </url>`)

const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls.join('\n')}
</urlset>`

writeFileSync('public/sitemap.xml', sitemap)

Testing JavaScript Rendering

Use Google Search Console's URL Inspection tool to see exactly what Google renders:

  1. Open Search Console → URL Inspection
  2. Enter your page URL
  3. Click "Test Live URL"
  4. Click "View Tested Page" → "Screenshot"

Compare the screenshot to your actual page. If content is missing, Google isn't rendering it properly.

Requesting Re-Indexing

After fixing issues, request re-indexing via URL Inspection:

  1. Search Console → URL Inspection
  2. Enter fixed URL
  3. Click "Request Indexing"

Google prioritizes these requests but doesn't guarantee indexing. It still evaluates content quality.

For many URLs, request indexing programmatically using the Google Indexing API. Note that Google officially supports this API only for pages with job posting or livestream structured data; submissions for other page types may be ignored:

// Request indexing via API
import { google } from 'googleapis'

async function requestIndexing(url: string) {
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/indexing']
  })

  const indexing = google.indexing({ version: 'v3', auth })

  await indexing.urlNotifications.publish({
    requestBody: {
      url,
      type: 'URL_UPDATED'
    }
  })
}

Monitoring Progress

Track indexing status changes over time:

  1. Search Console → Page Indexing
  2. Check "Not indexed" count weekly
  3. Look for status changes from "Crawled - not indexed" to "Indexed"

Expect changes to take 1-4 weeks. Google doesn't index on demand—it re-evaluates pages on its schedule.

Using Nuxt?

Nuxt handles SSR and prerendering automatically, avoiding most SPA indexing issues. See Nuxt indexing guide for framework-specific solutions.