Google Search Console shows which pages Google has indexed, which queries bring traffic, and what's broken. Every site needs it; without it you're blind to most indexing problems.
Visit search.google.com/search-console and add a property. You get two types:
- Domain property (example.com): covers every protocol and subdomain (www, m, blog)
- URL-prefix property (https://example.com): covers only URLs under the exact protocol and prefix you enter
Use a Domain property unless you only control a subdomain. It captures all traffic variations without managing separate properties.
Google needs proof you own the site. Pick one method and don't remove it—Google checks verification periodically.
Add a TXT record to your domain's DNS settings. This is the only method for Domain properties and the most reliable overall.
Steps:

1. In Search Console, choose the DNS record verification method for your property.
2. Copy the TXT value Google shows, e.g. google-site-verification=abc123xyz.
3. Add it as a TXT record in your domain's DNS settings.
4. Back in Search Console, click Verify.

DNS changes can take up to 48 hours to propagate, but verification works immediately once the record is visible. This method also eliminates re-verification if you switch between www and non-www URLs.
The record looks slightly different depending on your DNS provider. With a provider that uses @ for the root domain:

```txt
Type:    TXT
Name:    @
Content: google-site-verification=abc123xyz
TTL:     Auto
```

With a provider that expects the full hostname:

```txt
Record type: TXT
Name:        example.com
Value:       "google-site-verification=abc123xyz"
TTL:         300
```
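Before clicking Verify, you can check propagation yourself. A minimal sketch using Node's built-in DNS resolver (example.com is a placeholder):

```ts
// check-txt.ts (Node 18+; run with tsx or compile first)
import { resolveTxt } from 'node:dns/promises'

// Look up all TXT records for the domain and check for the Google token
const records = await resolveTxt('example.com')
const verified = records
  .flat()
  .some((value) => value.startsWith('google-site-verification='))

console.log(verified ? 'TXT record is visible' : 'Not propagated yet, try again later')
```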
Add a <meta> tag to your site's homepage <head>. Works for URL-prefix properties.
```ts
// app.vue, or a layout that renders on every page (including the homepage)
useHead({
  meta: [
    { name: 'google-site-verification', content: 'abc123xyz' }
  ]
})
```
The tag must appear in the HTML source—check with View Page Source (not DevTools). Nuxt's SSR ensures the tag is present in the initial HTML response.
Common issue: If you use a layout component for the tag, make sure it renders on every page including the homepage.
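If you'd rather not depend on a component or layout at all, the same tag can be set globally in the Nuxt config. A sketch, assuming Nuxt 3:

```ts
// nuxt.config.ts
export default defineNuxtConfig({
  app: {
    head: {
      meta: [
        { name: 'google-site-verification', content: 'abc123xyz' }
      ]
    }
  }
})
```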
Upload the verification HTML file Google provides (shown here as google-site-verification.html; the real filename contains a unique token) to your site's root directory. Works for URL-prefix properties.
Steps:

1. Download the file from Search Console.
2. Place it in your public/ directory so it's served from the site root.
3. Confirm it loads at https://example.com/google-site-verification.html.
4. Click Verify.

Don't delete this file after verification—Google re-checks it.
If you already use Google Analytics with the GA4 tracking code on your homepage, Search Console can verify via that tag.
Requirements:

- The gtag.js snippet must be in your homepage's <head> section, not the body.
- You must use the same Google account for Analytics and Search Console, with edit permission on the Analytics property.

This is the fastest method if you already have analytics—no code changes needed.
Same concept as Google Analytics. If you have GTM installed and the container published, Search Console auto-detects it for verification.
Requirements:

- Your account needs publish permission for the GTM container.
- The container snippet must sit immediately after the opening <body> tag.
After verification, tell Google where to find your pages:

1. Open Sitemaps in the left-hand menu.
2. Enter your sitemap URL, e.g. https://example.com/sitemap.xml.
3. Click Submit.

Status meanings:

- Success: Google fetched and parsed the sitemap.
- Couldn't fetch: the URL is wrong, blocked, or returning an error.
- Has errors: the XML parsed, but some entries are invalid.
Check back in 24-48 hours for discovered page counts. If Google found 0 URLs but your sitemap has URLs, your XML syntax is broken or the URLs return non-200 status codes.
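For reference, a minimal valid sitemap looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-01-01</lastmod>
  </url>
</urlset>
```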
You can reference your sitemap in robots.txt as an alternative submission method—Google auto-discovers it when crawling.
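The robots.txt reference is a single line; a sketch (the sitemap URL is a placeholder):

```txt
# robots.txt
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```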
Shows the last 16 months of search data:
Metrics: clicks, impressions, average CTR, and average position.
Filter by: query, page, country, device, search appearance, and date range.
Use this to find low-CTR pages with high impressions—they rank well but have poor titles or descriptions. Fix the meta tags and watch CTR climb.
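In a Nuxt site, that fix usually means rewriting the page's title and description. A sketch assuming Nuxt 3's useSeoMeta composable (the page and copy are hypothetical):

```vue
<!-- pages/pricing.vue (hypothetical page) -->
<script setup lang="ts">
// Clearer, benefit-led title and description to lift CTR on an already-ranking page
useSeoMeta({
  title: 'Widget Pricing: Plans from $9/month',
  description: 'Compare Widget plans and features. Start a free 14-day trial, no credit card required.',
})
</script>
```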
Replaced the old Coverage report in 2022. Shows indexing status for all discovered URLs:
Categories:

- Indexed: pages in Google's index that can appear in results.
- Not indexed: pages Google knows about but hasn't indexed, grouped by reason.
Click any category to see specific reasons like "Duplicate content", "Soft 404", or "Crawled but not indexed". The June 2025 core update aggressively deindexed low-quality pages—this report shows the damage.
Common issues:

- Crawled - currently not indexed: Google fetched the page but judged it not worth indexing yet.
- Discovered - currently not indexed: Google knows the URL but hasn't crawled it.
- Duplicate without user-selected canonical: Google chose a different canonical than you intended.
- Blocked by robots.txt or noindex: the page is excluded, deliberately or by accident.
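Duplicate groupings are often resolved by declaring the canonical yourself. A minimal sketch using Nuxt's useHead (the URL is a placeholder):

```ts
// In the affected page component: tell Google which URL is the canonical version
useHead({
  link: [
    { rel: 'canonical', href: 'https://example.com/blog/my-post' }
  ]
})
```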
Enter any URL to see:

- whether it's indexed and eligible to appear in results
- when Google last crawled it and how it discovered it
- the canonical URL Google selected
- the rendered HTML and a screenshot (via Test Live URL)
Click Request Indexing to ask Google to crawl the URL immediately. You get ~10 requests per day—use them for new or updated critical pages.
The rendered screenshot is gold for debugging—if it's blank or shows loading spinners, Google couldn't execute your JavaScript properly.
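On a Nuxt site, a blank screenshot usually means the page is being rendered client-side only. Worth confirming SSR hasn't been switched off (a sketch; ssr defaults to true in Nuxt 3):

```ts
// nuxt.config.ts
export default defineNuxtConfig({
  ssr: true, // the default; ssr: false makes Google depend entirely on executing your JS
})
```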
After publishing new content:

1. Make sure the new URL is in your sitemap.
2. Inspect the URL with the URL Inspection tool.
3. Click Request Indexing.
Google typically crawls within hours but sometimes takes days. Requesting indexing doesn't guarantee it—Google still evaluates quality.
Need to hide a URL from search results quickly?

1. Open Removals in the left-hand menu.
2. Click New Request and enter the URL.
3. Choose Temporarily remove URL and submit.
Removals last approximately 6 months. For permanent removal, return 404/410 status or add noindex meta tag, then request removal.
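For the noindex option in a Nuxt app, the tag can be set from the page component. A minimal sketch:

```ts
// In the page you want dropped from the index
useHead({
  meta: [
    { name: 'robots', content: 'noindex' }
  ]
})
```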
If Google detects malware or hacking, you'll see alerts in Search Console. Fix the issue first (remove malware, update plugins, patch vulnerabilities), then request a security review under Security Issues.
Ignore these warnings and Google may deindex your entire site to protect users.
"Index Coverage report delayed since Nov 2025":
This is a known bug. Google confirmed it's a reporting issue only—actual crawling and indexing still work. Use the URL Inspection tool for real-time checks on individual pages.
Sitemap shows "Discovered - currently not indexed":
Google found the URLs but has decided not to crawl or index them yet. Reasons include:

- thin or duplicate content dragging down perceived site quality
- a new site with few internal or external links pointing at the pages
- crawl budget limits on very large sites
Fix content quality issues before re-submitting. Don't spam "Request Indexing"—it doesn't override quality filters.
Verification suddenly fails:
You accidentally removed the verification method. Common causes:
- a DNS migration dropped the TXT record
- a redesign or layout change removed the meta tag from the <head> tags
- the verification HTML file was deleted during a deploy
- the Google Analytics or Tag Manager snippet was removed

Re-verify using the original method.