The sitemap.xml file helps search engines discover your pages. Google considers a site "small" if it has roughly 500 pages or fewer. You likely need a sitemap if you exceed that threshold, if your site is new with few backlinks, or if you update content frequently.
For small sites under 100 pages, create a static sitemap in your public directory:
```
public/
  sitemap.xml
```
Add your URLs with proper formatting:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mysite.com/</loc>
    <lastmod>2024-11-03</lastmod>
  </url>
  <url>
    <loc>https://mysite.com/about</loc>
    <lastmod>2024-12-10</lastmod>
  </url>
</urlset>
```
Sitemaps are limited to 50,000 URLs or 50MB uncompressed. Use UTF-8 encoding and absolute URLs only.
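If you generate sitemaps programmatically, it can be worth sanity-checking these limits before deploying. Here's a minimal sketch; the `checkSitemap` helper, its output path, and the warnings are illustrative, not part of any library:

```js
// check-sitemap.js: sanity-check a generated sitemap against protocol limits
import { readFileSync } from 'node:fs'

function checkSitemap(path) {
  const xml = readFileSync(path, 'utf-8')
  const urlCount = (xml.match(/<loc>/g) || []).length
  const sizeMB = Buffer.byteLength(xml, 'utf-8') / (1024 * 1024)

  if (urlCount > 50000) console.warn(`Too many URLs: ${urlCount} (max 50,000)`)
  if (sizeMB > 50) console.warn(`File too large: ${sizeMB.toFixed(1)}MB (max 50MB uncompressed)`)

  // Every <loc> must be an absolute URL; relative paths are invalid in sitemaps
  const relative = xml.match(/<loc>(?!https?:\/\/)/g)
  if (relative) console.warn(`${relative.length} relative URL(s) found`)

  return { urlCount, sizeMB }
}

checkSitemap('./dist/sitemap.xml')
```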
For most Vue projects, use vite-plugin-sitemap to auto-generate sitemaps during build:
```js
// vite.config.js
import { defineConfig } from 'vite'
import Sitemap from 'vite-plugin-sitemap'

export default defineConfig({
  plugins: [
    Sitemap({
      hostname: 'https://mysite.com',
      dynamicRoutes: [
        '/blog/post-1',
        '/blog/post-2'
      ]
    })
  ]
})
```
After running npm run build, this generates sitemap.xml and robots.txt in your dist folder.
If using Vue Router, generate routes from your router configuration:
```js
// vite.config.js
import { defineConfig } from 'vite'
import Sitemap from 'vite-plugin-sitemap'
import routes from './src/router/routes'

export default defineConfig({
  plugins: [
    Sitemap({
      hostname: 'https://mysite.com',
      dynamicRoutes: routes.map(route => route.path)
    })
  ]
})
```
Avoid hash-based routing (/#/about). Search engines ignore everything after the hash, so every hash route resolves to the same URL and can't be listed as a sitemap entry. Use history mode instead.
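For reference, history mode is enabled through `createWebHistory` when you create the router. This is a standard Vue Router 4 setup; the routes and component paths here are placeholders:

```js
// src/router/index.js
import { createRouter, createWebHistory } from 'vue-router'

const routes = [
  { path: '/', component: () => import('../pages/Home.vue') },
  { path: '/about', component: () => import('../pages/About.vue') }
]

export default createRouter({
  // createWebHistory produces clean URLs like /about instead of /#/about
  history: createWebHistory(),
  routes
})
```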
For sites with frequently changing content (e-commerce, blogs, news), generate sitemaps server-side:
```js
import express from 'express'

const app = express()

app.get('/sitemap.xml', async (req, res) => {
  // fetchAllPages() stands in for your own data source (database, CMS, API)
  const pages = await fetchAllPages()
  const urls = pages.map(page => `
  <url>
    <loc>https://mysite.com${page.path}</loc>
    <lastmod>${page.updatedAt}</lastmod>
  </url>`).join('\n')

  const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>`

  res.type('application/xml').send(sitemap)
})

app.listen(3000)
```
In a Vite SSR setup, the same handler runs as middleware in your server entry:

```js
// server.js for Vite SSR
import express from 'express'

const app = express()

app.use(async (req, res, next) => {
  if (req.path === '/sitemap.xml') {
    const pages = await fetchAllPages()
    const urls = pages.map(page => `
  <url>
    <loc>https://mysite.com${page.path}</loc>
    <lastmod>${page.updatedAt}</lastmod>
  </url>`).join('\n')

    const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>`

    return res.type('application/xml').send(sitemap)
  }
  next()
})
```
On Nuxt or any other Nitro-based server, the same logic fits in an h3 event handler, for example as server middleware:

```js
import { defineEventHandler, setHeader } from 'h3'

export default defineEventHandler(async (event) => {
  if (event.path === '/sitemap.xml') {
    const pages = await fetchAllPages()
    const urls = pages.map(page => `
  <url>
    <loc>https://mysite.com${page.path}</loc>
    <lastmod>${page.updatedAt}</lastmod>
  </url>`).join('\n')

    const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>`

    setHeader(event, 'Content-Type', 'application/xml')
    return sitemap
  }
})
```
This works with any Node.js server or SSR framework. For SPAs without SSR, use build-time generation instead: there is no server to assemble the sitemap per request, so you have to enumerate every route when you build, as sketched below.
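For a pure SPA, a small post-build script can do the same work and write the file into dist/ before you deploy. This is a rough sketch; the output path, hostname, and the placeholder `fetchAllPages` function are assumptions to adapt to your project:

```js
// scripts/generate-sitemap.mjs
// Run after the build, e.g. "build": "vite build && node scripts/generate-sitemap.mjs"
import { writeFileSync } from 'node:fs'

// Placeholder: replace with a real query against your CMS, database, or API
async function fetchAllPages() {
  return [
    { path: '/', updatedAt: '2024-11-03' },
    { path: '/about', updatedAt: '2024-12-10' }
  ]
}

const pages = await fetchAllPages()
const urls = pages.map(page => `
  <url>
    <loc>https://mysite.com${page.path}</loc>
    <lastmod>${page.updatedAt}</lastmod>
  </url>`).join('\n')

const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>`

writeFileSync('./dist/sitemap.xml', sitemap)
```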
A basic sitemap uses these elements:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mysite.com/page</loc>
    <lastmod>2024-11-03</lastmod>
  </url>
</urlset>
```
| Tag | Status | Google Uses? |
|---|---|---|
| `<loc>` | Required | Yes—the page URL |
| `<lastmod>` | Recommended | Yes—if consistently accurate |
| `<changefreq>` | Skip | No—Google ignores this |
| `<priority>` | Skip | No—Google ignores this |
Google only uses <lastmod> if it matches reality. If your page changed 7 years ago but you claim it updated yesterday, Google stops trusting your sitemap.
Don't bother with <changefreq> or <priority>—they increase file size without adding value.
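When generating entries, derive `<lastmod>` from a real timestamp, such as the record's update time, rather than the build date. A small sketch, assuming each page carries an `updatedAt` field:

```js
// Format a stored timestamp as the W3C date format sitemaps use (YYYY-MM-DD)
function lastmodFor(page) {
  // Use the content's actual update time, never `new Date()` at build time,
  // or every URL will claim to have changed on every deploy
  return new Date(page.updatedAt).toISOString().split('T')[0]
}

lastmodFor({ updatedAt: '2024-11-03T09:30:00Z' }) // '2024-11-03'
```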
If you exceed 50,000 URLs or 50MB, split your sitemap into multiple files:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://mysite.com/sitemap-products.xml</loc>
    <lastmod>2024-11-03</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://mysite.com/sitemap-blog.xml</loc>
    <lastmod>2024-12-10</lastmod>
  </sitemap>
</sitemapindex>
```
Submit the index file to Google Search Console—it will crawl all referenced sitemaps. Maximize URLs per sitemap rather than creating many small files.
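If you generate sitemaps yourself, the split is a matter of chunking your URL list and emitting one file per chunk plus the index. A sketch under the same assumptions as the earlier examples (the hostname, output directory, and page shape are yours to adapt):

```js
import { writeFileSync } from 'node:fs'

const MAX_URLS = 50000
const HOST = 'https://mysite.com'

// Split a page list into numbered sitemap files plus one index file
function buildSitemapFiles(pages) {
  const files = []
  for (let i = 0; i < pages.length; i += MAX_URLS) {
    const chunk = pages.slice(i, i + MAX_URLS)
    const name = `sitemap-${files.length + 1}.xml`
    const urls = chunk
      .map(p => `  <url>\n    <loc>${HOST}${p.path}</loc>\n  </url>`)
      .join('\n')
    writeFileSync(`./dist/${name}`,
      `<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`)
    files.push(name)
  }
  // sitemap.xml becomes the index; this is the file you submit to Search Console
  const index = files
    .map(f => `  <sitemap>\n    <loc>${HOST}/${f}</loc>\n  </sitemap>`)
    .join('\n')
  writeFileSync('./dist/sitemap.xml',
    `<?xml version="1.0" encoding="UTF-8"?>\n<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${index}\n</sitemapindex>`)
}
```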
News publishers should create separate news sitemaps with articles from the last 2 days:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://mysite.com/article</loc>
    <news:news>
      <news:publication>
        <news:name>Site Name</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2024-12-10T12:00:00+00:00</news:publication_date>
      <news:title>Article Title</news:title>
    </news:news>
  </url>
</urlset>
```
Remove articles older than 2 days to keep the sitemap fresh. Don't create new sitemaps daily—update the existing one.
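The two-day window is easy to enforce when you build the file, by filtering your article list before rendering entries. A sketch, where the article fields (`path`, `publishedAt`, `title`) are assumed names:

```js
const TWO_DAYS_MS = 2 * 24 * 60 * 60 * 1000

// Keep only articles published within the last two days, then render news entries
function buildNewsEntries(articles, now = Date.now()) {
  return articles
    .filter(a => now - new Date(a.publishedAt).getTime() <= TWO_DAYS_MS)
    .map(a => `
  <url>
    <loc>https://mysite.com${a.path}</loc>
    <news:news>
      <news:publication>
        <news:name>Site Name</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>${new Date(a.publishedAt).toISOString()}</news:publication_date>
      <news:title>${a.title}</news:title>
    </news:news>
  </url>`)
    .join('\n')
}
```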
For galleries or image-heavy sites, add image metadata:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://mysite.com/page</loc>
    <image:image>
      <image:loc>https://mysite.com/image.jpg</image:loc>
      <image:title>Image Title</image:title>
    </image:image>
  </url>
</urlset>
```
Image sitemaps help Google find images loaded via JavaScript or not directly linked in HTML.
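If your gallery data already lists image URLs per page, the extra tags fold into the same generation loop. A brief sketch, assuming each page carries an `images` array with `url` and `title` fields:

```js
// Render one <url> entry with its <image:image> children
function imageEntry(page) {
  const images = page.images
    .map(img => `
    <image:image>
      <image:loc>${img.url}</image:loc>
      <image:title>${img.title}</image:title>
    </image:image>`)
    .join('')
  return `
  <url>
    <loc>https://mysite.com${page.path}</loc>${images}
  </url>`
}
```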
After generating your sitemap, submit it to Google Search Console: open the Sitemaps report and enter your sitemap URL (https://mysite.com/sitemap.xml). Google processes the sitemap and reports its status. Check the Sitemaps report for coverage stats, indexing errors, and discovered URLs.
Before submitting, validate your sitemap: confirm it is well-formed UTF-8 XML, uses absolute URLs, and that the <lastmod> dates are accurate.
Reference your sitemap in robots.txt:
```
User-agent: *
Allow: /
Sitemap: https://mysite.com/sitemap.xml
```
Google discovers this automatically when crawling your robots.txt.
If you're using Nuxt, check out Nuxt SEO which handles much of this automatically.
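For reference, with the Nuxt sitemap module from that suite, the setup is roughly a module entry plus your site URL in the Nuxt config. Treat this as a sketch and check the module docs, since option names can shift between versions (most Nuxt 3 projects use nuxt.config.ts with the same content):

```js
// nuxt.config.js
export default defineNuxtConfig({
  modules: ['@nuxtjs/sitemap'],
  site: {
    // Used as the hostname for generated sitemap entries
    url: 'https://mysite.com'
  }
})
```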