Dynamic rendering detects crawler requests and serves them pre-rendered HTML, while users receive client-side rendered content. Google explicitly calls this a "workaround" and no longer recommends it as of 2025—use SSR or SSG instead.
Use only when SSR/SSG prove impractical due to legacy constraints. Modern frameworks make server-side rendering straightforward, eliminating the need for this complexity.
Google positions dynamic rendering as error-prone, increasing server load and maintenance burden. Serving different content to crawlers versus users risks cloaking violations if implementations diverge.
User-agent detection is fragile—new crawlers break assumptions, AI bots go unrecognized, and maintenance never ends. John Mueller confirms no ranking benefit exists between dynamic rendering and SSR—they're infrastructure choices, not SEO advantages.
Googlebot handles JavaScript well in 2025. Server-side rendering ensures immediate content visibility without relying on crawler execution behavior.
A basic implementation with Express and Playwright, here wired into a Vite dev server (the port is illustrative):

import express from 'express'
import { chromium } from 'playwright'
import { createServer } from 'vite'

const PORT = 3000
const app = express()
const crawlers = /googlebot|bingbot|slurp|duckduckbot/i

// Intercept crawler requests and respond with pre-rendered HTML
app.use(async (req, res, next) => {
  const userAgent = req.get('user-agent') ?? ''
  if (!crawlers.test(userAgent)) {
    return next()
  }
  // Playwright's request carries a regular Chrome user agent,
  // so it falls through to the Vite middleware below instead of looping
  const browser = await chromium.launch()
  try {
    const page = await browser.newPage()
    await page.goto(`http://localhost:${PORT}${req.path}`)
    await page.waitForLoadState('networkidle')
    res.send(await page.content())
  } finally {
    await browser.close()
  }
})

// Regular users get the client-side app from Vite in middleware mode
const vite = await createServer({ server: { middlewareMode: true } })
app.use(vite.middlewares)

app.listen(PORT)
The same middleware on a plain Express server serving a built SPA:

import express from 'express'
import { chromium } from 'playwright'

const app = express()
const crawlers = /googlebot|bingbot|slurp|duckduckbot/i

app.use(async (req, res, next) => {
  if (!crawlers.test(req.get('user-agent') ?? '')) {
    return next()
  }
  const browser = await chromium.launch()
  try {
    const page = await browser.newPage()
    await page.goto(`http://localhost:3000${req.path}`)
    await page.waitForLoadState('networkidle')
    res.send(await page.content())
  } finally {
    // Always release the browser, even when rendering fails
    await browser.close()
  }
})

// Everyone else gets the client-side bundle (output directory is illustrative)
app.use(express.static('dist'))
app.listen(3000)
The equivalent Nitro route handler for Nuxt:

import { defineEventHandler, getHeader } from 'h3'
import { chromium } from 'playwright'

const crawlers = /googlebot|bingbot|slurp|duckduckbot/i

export default defineEventHandler(async (event) => {
  const userAgent = getHeader(event, 'user-agent') ?? ''
  if (!crawlers.test(userAgent)) {
    return // fall through to the normal client-side app
  }
  const browser = await chromium.launch()
  try {
    const page = await browser.newPage()
    await page.goto(`http://localhost:3000${event.path}`)
    await page.waitForLoadState('networkidle')
    return await page.content()
  } finally {
    await browser.close()
  }
})
This approach requires headless browser infrastructure: a Chrome binary (~280 MB), system dependencies, memory management, and error handling. A full production setup takes 4 to 8 weeks of engineering effort. Note that the examples above launch a fresh browser for every crawler request; a production setup would reuse one browser instance and cache rendered HTML.
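A minimal sketch of those two optimizations, assuming a single shared Playwright browser and a naive in-memory Map cache (the CACHE_TTL value and the localhost URL are illustrative, not production-ready):

import { chromium } from 'playwright'

const CACHE_TTL = 1000 * 60 * 60 // illustrative: re-render after one hour
const cache = new Map() // path -> { html, renderedAt }
let browser

async function renderPath(path) {
  // Reuse one browser instance instead of launching per request
  browser ??= await chromium.launch()

  const cached = cache.get(path)
  if (cached && Date.now() - cached.renderedAt < CACHE_TTL) {
    return cached.html
  }

  const page = await browser.newPage()
  try {
    await page.goto(`http://localhost:3000${path}`)
    await page.waitForLoadState('networkidle')
    const html = await page.content()
    cache.set(path, { html, renderedAt: Date.now() })
    return html
  } finally {
    await page.close()
  }
}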
Google's open-source solution built on Puppeteer. Deploy your own server, customize rendering logic, maintain Chrome updates manually.
Popularity is declining (138 weekly npm downloads versus Puppeteer's 6.2M), and the GitHub repository has been archived. No Dockerfile is included; refer to the Puppeteer docs for deployment.
Free but requires infrastructure. Memory leaks, Chrome crashes, and crawler detection updates fall on you.
A commercial service that handles rendering, caching, and crawler detection automatically. There is zero maintenance burden: the vendor manages Chrome updates, resource allocation, and error recovery.
Cache expiration ranges from 6 hours to 30 days depending on plan; submit a sitemap for automatic refresh. It works with all frameworks without code changes.
Pricing starts at $50/month for 10K pages. Worth it if it saves weeks of engineering work.
Open-source Node library controlling headless Chrome. Full control, zero licensing costs, maximum complexity.
Production deployment requires 30+ system packages on Linux, and you must handle caching, scaling, and failure recovery yourself. Maintenance never stops: Chrome updates monthly, memory leaks accumulate, edge cases multiply.
Choose it if you have spare engineering time and need custom rendering logic SSR can't provide; a minimal render function looks like the sketch below.
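A minimal sketch of a Puppeteer render function, equivalent to the Playwright examples above (the target URL is illustrative):

import puppeteer from 'puppeteer'

async function render(url) {
  const browser = await puppeteer.launch()
  try {
    const page = await browser.newPage()
    // networkidle0: resolve once the network has been idle for 500 ms
    await page.goto(url, { waitUntil: 'networkidle0' })
    return await page.content()
  } finally {
    await browser.close()
  }
}

const html = await render('http://localhost:3000/some-page')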
Minimal Rendertron setups, in the same Express, Vite, and Nuxt variants:
import express from 'express'
import rendertron from 'rendertron-middleware'

const app = express()

// Proxy matching crawler requests to a Rendertron instance
app.use(rendertron.makeMiddleware({
  proxyUrl: 'https://render-tron.appspot.com/render',
  userAgentPattern: /googlebot|bingbot|slurp|duckduckbot|whatsapp|facebookexternalhit|twitterbot/i
}))

app.listen(3000)
import express from 'express'
import rendertron from 'rendertron-middleware'
import { createServer } from 'vite'

const app = express()

// Crawler requests are proxied to Rendertron before Vite sees them
app.use(rendertron.makeMiddleware({
  proxyUrl: 'https://render-tron.appspot.com/render'
}))

// Everyone else gets the client-side app from Vite
const vite = await createServer({ server: { middlewareMode: true } })
app.use(vite.middlewares)

app.listen(3000)
import { defineEventHandler, getHeader, proxyRequest } from 'h3'

const crawlers = /googlebot|bingbot|slurp|duckduckbot|whatsapp|facebookexternalhit|twitterbot/i

export default defineEventHandler(async (event) => {
  const userAgent = getHeader(event, 'user-agent') ?? ''
  if (crawlers.test(userAgent)) {
    // Rendertron's /render endpoint expects the full URL of the page to render
    return proxyRequest(event, `https://render-tron.appspot.com/render/https://your-site.com${event.path}`)
  }
})
Google's public Rendertron instance is for testing only. Run your own for production.
Valid use cases:

- Legacy SPAs where migrating to SSR or SSG is impractical in the near term
- Social crawlers (WhatsApp, Facebook, Twitter) that need pre-rendered meta tags

Don't use for:

- New projects: modern frameworks make SSR straightforward
- Chasing rankings: there is no ranking benefit over SSR
Modern frameworks make SSR straightforward. Nuxt handles it automatically, and a custom Vite SSR setup takes a weekend (a sketch follows below). Dynamic rendering adds complexity that rarely pays off.
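A minimal sketch of that Vite SSR setup, assuming a src/entry-server.js exporting a render(url) function and an index.html containing an <!--app-html--> placeholder (both names are illustrative conventions, not fixed APIs):

import fs from 'node:fs'
import express from 'express'
import { createServer } from 'vite'

const app = express()
const vite = await createServer({ server: { middlewareMode: true }, appType: 'custom' })
app.use(vite.middlewares)

app.use('*', async (req, res) => {
  // Render the app on the server and inject it into the HTML template
  const template = await vite.transformIndexHtml(
    req.originalUrl,
    fs.readFileSync('index.html', 'utf-8')
  )
  const { render } = await vite.ssrLoadModule('/src/entry-server.js')
  const appHtml = await render(req.originalUrl)
  res.status(200).set({ 'Content-Type': 'text/html' }).send(
    template.replace('<!--app-html-->', appHtml)
  )
})

app.listen(3000)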
Serving crawlers different content than users violates Google's webmaster guidelines if the versions diverge. If the user version throws JavaScript errors while crawlers receive perfect HTML, you risk a manual penalty.
Keep rendered output identical:

- Same text content and headings in both versions
- Same title, meta description, and canonical tags
- Same links and navigation
Safer to fix SSR hydration issues than maintain two code paths.
Verify crawlers receive correct HTML:
# Test with Googlebot user agent
curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
https://your-site.com
# Should return fully rendered HTML with meta tags
Check for:

- Fully rendered content in the response body, not an empty app shell
- Correct title and meta tags
- Output that matches what users see
Use Google Search Console URL Inspection to see what Googlebot receives. If it differs from user version, you risk penalties.
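A quick way to compare the two versions, using Node 18+'s global fetch (a naive exact-string comparison; real responses may differ in harmless ways like nonces or timestamps):

// Fetch the same page as Googlebot and as a regular browser
const fetchAs = (userAgent) =>
  fetch('https://your-site.com', { headers: { 'user-agent': userAgent } })
    .then((res) => res.text())

const [crawlerHtml, userHtml] = await Promise.all([
  fetchAs('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'),
  fetchAs('Mozilla/5.0')
])

console.log(crawlerHtml === userHtml ? 'identical' : 'versions diverge')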
When moving from dynamic rendering to SSR, guard browser-only code: move window and document access into onMounted, or check typeof window before touching it. Don't attempt big-bang rewrites; migrate route by route and keep dynamic rendering for the paths you haven't covered yet.
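For example, a sketch of guarding browser-only access in a Vue component (the innerWidth usage is illustrative):

<script setup>
import { onMounted, ref } from 'vue'

const width = ref(0)

onMounted(() => {
  // Runs only in the browser; during SSR this callback never executes,
  // so touching window here is safe
  width.value = window.innerWidth
})
</script>

<template>
  <p>Viewport width: {{ width }}</p>
</template>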
Nuxt handles server-side rendering automatically. No need for dynamic rendering workarounds.