Single page applications render content with JavaScript after the page loads. This creates a timing problem: search engines see an empty page before your JavaScript runs.
Google can render JavaScript, but it queues pages for rendering separately from crawling, so a page can wait seconds or longer before rendering happens. Meanwhile, most other crawlers (Bing, social platforms, AI bots) execute little or no JavaScript and often see nothing at all.
Traditional HTML sites — Server sends complete HTML. Crawler sees content immediately.
Vue SPAs — Server sends minimal HTML with <div id="app"></div>. JavaScript downloads, executes, then renders content. Crawlers must wait and execute JavaScript to see anything.
Google handles this better than other crawlers, but even Googlebot faces challenges. JavaScript rendering requires substantial computing capacity—downloading, parsing, and executing scripts takes more resources than reading static HTML.
Meta tags render too late — Your useSeoMeta() calls run after JavaScript loads. Initial HTML has no title, description, or Open Graph tags. Social platforms and preview generators see nothing.
Content isn't indexed — If rendering fails or times out, Google indexes an empty page. Even when successful, delayed indexing is common.
Client-side routing hides pages — SPAs that use URL fragments (#/about) or JavaScript-only navigation give crawlers no real URLs to follow. Crawlers may not discover all your routes.
Performance suffers — JavaScript bundles delay First Contentful Paint. Search engines factor Core Web Vitals into rankings.
Not every Vue app needs server-side rendering. SPAs work when:
Your app requires authentication — Admin panels, dashboards, internal tools. Block these from indexing with a meta robots tag anyway (see the snippet after this list).
You don't need search traffic — Apps used through direct links or bookmarks. No need to rank in Google.
You only share via direct links — If users share app URLs to logged-in colleagues, Open Graph previews don't matter.
Content is user-generated and private — Chat apps, project management tools, personal data views.
For these cases, skip the SSR complexity. Focus on app performance and user experience instead.
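For the authenticated apps mentioned above, blocking indexing is a one-liner. A minimal sketch using useSeoMeta() (the same composable shown later in this guide), whose robots key renders a meta robots tag:

<script setup>
// Keep private screens out of search results.
useSeoMeta({
  robots: 'noindex, nofollow'
})
</script>

In a pure SPA this tag only appears after JavaScript runs, which Google still respects once it renders the page.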
Server-side rendering matters when:
Public content needs indexing — Marketing sites, blogs, documentation, product pages, landing pages. If Google should index it, you need SSR.
Social sharing matters — Link previews on Twitter, LinkedIn, Slack require meta tags in initial HTML. SPAs fail here without SSR or prerendering.
Performance is critical — SSR delivers faster First Contentful Paint. Users see content before JavaScript loads. This improves both user experience and search rankings.
You want predictable indexing — SSR guarantees search engines see your content. No waiting for JavaScript execution. No risk of rendering failures.
Renders pages on the server for every request. Guarantees that search engines and users get complete HTML immediately.
Best for: Dynamic content, frequent updates, personalized pages, and sites that mix public pages with authenticated areas.
Tools: Nuxt (recommended), Quasar SSR, custom Vite SSR.
Trade-offs: Requires Node.js server, more complex deployment, higher server costs.
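Setup can be minimal. A sketch of a Nuxt config (SSR is already Nuxt's default; the option is shown explicitly for clarity):

// nuxt.config.ts
export default defineNuxtConfig({
  // Render every request on the server (this is the default).
  ssr: true
})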
Generates static HTML at build time for known routes. Serves static files to crawlers and users.
Best for: Marketing sites, blogs, documentation—content that doesn't change per request.
Tools: vite-ssg, prerender-spa-plugin.
Trade-offs: Only works for routes you know at build time. Content updates require rebuilding. Can't handle dynamic routes like /users/:id.
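As a sketch of the vite-ssg approach: your entry exports the app through ViteSSG instead of calling createApp, and each static route becomes its own HTML file at build time (App.vue and the routes array are assumed to exist in your project):

// src/main.js
import { ViteSSG } from 'vite-ssg'
import App from './App.vue'
import { routes } from './routes'

// vite-ssg visits these routes at build time and writes one HTML file per route.
export const createApp = ViteSSG(App, { routes })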
Serves prerendered HTML to bots and the JavaScript app to users. Google deprecated this approach in 2024, but it still works.
Best for: Migration path when you can't implement full SSR yet.
Tools: Prerender.io, Rendertron.
Trade-offs: Requires bot detection (user agent sniffing), means maintaining two versions of your site, and may be seen as cloaking if the versions differ.
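If you take this path with Prerender.io, bot detection usually lives in server middleware. A sketch using the prerender-node Express middleware (the token and port are placeholders):

// server.js
import express from 'express'
import prerender from 'prerender-node'

const app = express()

// Known bot user agents get prerendered HTML from Prerender.io;
// regular browsers fall through to the normal SPA.
app.use(prerender.set('prerenderToken', 'YOUR_TOKEN'))
app.use(express.static('dist'))
app.listen(3000)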
Test your SPA to understand what crawlers see:
Chrome DevTools — Disable JavaScript in DevTools settings and reload your page. This approximates what most non-Google crawlers see.
View Page Source — Right-click → View Page Source. This is the initial HTML Google receives before JavaScript runs.
Google Search Console — URL Inspection tool shows how Google rendered your page. Compare "crawled page" vs "live page."
Rich Results Test — Google's public testing tool renders JavaScript and shows the resulting HTML. (The older Mobile-Friendly Test was retired in late 2023.)
If your content doesn't appear in these tests without JavaScript, search engines struggle with your site.
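You can also script the check. A small Node sketch (Node 18+ for the global fetch; the URL and checks are placeholders) that fetches the raw HTML exactly as a non-rendering crawler would:

// check-initial-html.mjs
const res = await fetch('https://yoursite.com/about')
const html = await res.text()

// Look for content that should exist without JavaScript.
console.log('title present:', /<title>[^<]+<\/title>/.test(html))
console.log('description present:', html.includes('name="description"'))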
Even with Google's JavaScript rendering, meta tags need to be in initial HTML for:
Social platforms — Twitter, Facebook, LinkedIn, Slack don't run JavaScript. They only see initial HTML.
Speed — Google can use meta tags from the initial HTML before rendering completes, so tags present there take effect sooner.
Reliability — JavaScript execution can be blocked or delayed. Initial HTML guarantees tags are present.
<script setup>
// ❌ Meta tags only exist after JavaScript runs
useSeoMeta({
  title: 'My SPA Site',
  description: 'This description only exists client-side'
})
</script>
Initial HTML:
<!DOCTYPE html>
<html>
  <head>
    <!-- Empty! No meta tags. -->
  </head>
  <body>
    <div id="app"></div>
    <script src="/app.js"></script>
  </body>
</html>
With server-side rendering, useSeoMeta() runs on the server:
<!DOCTYPE html>
<html>
  <head>
    <title>My SPA Site</title>
    <meta name="description" content="This description exists immediately">
    <meta property="og:title" content="My SPA Site">
  </head>
  <body>
    <div id="app"><!-- Server-rendered content --></div>
    <script src="/app.js"></script>
  </body>
</html>
SPAs change pages without reloading the browser. This creates problems:
URL fragments ignored — URLs like yoursite.com/#/about look like yoursite.com/ to search engines. The fragment is a browser-only detail; Google ignores it for indexing.
History API works — Modern routers (Vue Router) use History API with real URLs (yoursite.com/about). This works for SEO if you have SSR or prerendering.
Without SSR — Every route returns the same empty HTML. Google discovers routes from links and sitemaps, but sees the same empty content for all URLs.
With SSR — Each route renders its own HTML. Google sees different content per URL. This works correctly.
Use Vue Router with History mode, not hash mode:
import { createRouter, createWebHashHistory, createWebHistory } from 'vue-router'

// ❌ Hash mode - SEO doesn't work
createRouter({
  history: createWebHashHistory(),
  routes: [/* ... */]
})

// ✅ History mode - works with SSR/prerendering
createRouter({
  history: createWebHistory(),
  routes: [/* ... */]
})
Hash mode URLs (#/about) all look identical to the server because browsers never send the fragment in requests. History mode URLs (/about) work like traditional website URLs.
Note: History mode requires server configuration to handle direct URL access. All routes must serve your index.html.
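A sketch of that fallback with Express (directory and port are placeholders; nginx's try_files directive achieves the same thing):

// server.js
import express from 'express'
import path from 'node:path'

const app = express()
app.use(express.static('dist'))

// Any route not matched by a static file falls back to the SPA shell,
// so direct loads of /about work.
app.get('*', (req, res) => {
  res.sendFile(path.resolve('dist', 'index.html'))
})

app.listen(3000)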
Even with SSR, generate a sitemap:
Static routes — List all routes in public/sitemap.xml:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2025-12-17</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/about</loc>
    <lastmod>2025-12-17</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/pricing</loc>
    <lastmod>2025-12-17</lastmod>
  </url>
</urlset>
Dynamic routes — Generate sitemap from your data source during build or on demand. See sitemaps guide for details.
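As a build-time sketch, assuming your dynamic pages come from a data source (fetchProducts is a hypothetical stand-in for yours):

// scripts/generate-sitemap.mjs
import { writeFile } from 'node:fs/promises'
import { fetchProducts } from './data.mjs' // hypothetical data source

const products = await fetchProducts()
const urls = [
  'https://yoursite.com/',
  ...products.map((p) => `https://yoursite.com/products/${p.slug}`),
]

const xml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls.map((loc) => `  <url><loc>${loc}</loc></url>`).join('\n')}
</urlset>`

await writeFile('public/sitemap.xml', xml)

Run it as part of your build so the sitemap stays in sync with your content.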
Google's official guidance on JavaScript and SEO: see "Understand the JavaScript SEO basics" in Google Search Central (https://developers.google.com/search/docs/crawling-indexing/javascript/javascript-seo-basics).
If you're using Nuxt, most of this is handled automatically. Nuxt provides SSR by default, proper meta tag handling via useSeoMeta(), and sitemap generation through its module ecosystem.