Google renders JavaScript to index content, but the rendering pipeline introduces delays and failure points that pure HTML pages avoid. Googlebot processes pages in two waves: first it crawls the raw HTML, then it queues the page for rendering in a headless Chromium instance. The rendering queue can delay indexing by seconds to days depending on crawl demand.
How Googlebot Renders JavaScript
- Crawl phase: Googlebot fetches the raw HTML and discovers links in the source
- Render queue: The page waits in a rendering queue (variable delay)
- Render phase: A headless Chromium instance executes JavaScript and captures the final DOM
- Indexing: The rendered HTML is indexed and content is extracted
Googlebot uses a relatively current version of Chromium, so modern JavaScript features (ES2020+, async/await, fetch API) work. However, rendering consumes significant resources, and Google cannot guarantee every page in the render queue gets processed promptly.
Client-Side Rendering (CSR) Risks
Single-page applications that render everything in JavaScript create several SEO problems:
- Empty initial HTML. Googlebot sees a <div id="root"></div> in the first crawl wave and must wait for rendering to discover content.
- Broken internal links. Client-side routers that use history.pushState without corresponding server routes return 404s when Googlebot fetches those URLs directly.
- Missing meta tags. If <title> and <meta name="description"> are set via JavaScript, they are not present when the page is first crawled.
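The "empty initial HTML" problem is easy to check for programmatically. A minimal sketch in Node (isEmptyShell is a hypothetical helper written for illustration, not a real API):

```javascript
// Sketch: does a raw HTML response contain any visible text, or is it
// just a CSR shell like <div id="root"></div>? (hypothetical helper)
function isEmptyShell(html) {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, '') // drop inline scripts
    .replace(/<style[\s\S]*?<\/style>/gi, '')   // drop inline styles
    .replace(/<[^>]+>/g, '')                    // drop remaining tags
    .trim();
  return text.length === 0;
}

console.log(isEmptyShell('<div id="root"></div>'));               // true
console.log(isEmptyShell('<div id="root"><h1>Shoes</h1></div>')); // false
```

Running this against the raw HTML your server returns (not the rendered DOM) tells you what the first crawl wave actually sees.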
Server-Side Rendering (SSR) as the Fix
SSR generates full HTML on the server before sending it to the browser. Googlebot gets complete content in the first crawl wave:
// Next.js example: server-side rendering
export async function getServerSideProps(context) {
  const data = await fetchProductData(context.params.id);
  return { props: { product: data } };
}
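Stripped of the framework, the essential SSR move is assembling the complete document on the server so the first crawl wave already contains the title, meta description, and heading. A minimal sketch (renderProductPage and the product data are made up for illustration):

```javascript
// Sketch: the server builds the full HTML document, so Googlebot needs
// no rendering step to see the content (data here is illustrative).
function renderProductPage(product) {
  return `<!doctype html>
<html>
  <head>
    <title>${product.name} | Example Store</title>
    <meta name="description" content="${product.description}">
  </head>
  <body>
    <h1>${product.name}</h1>
    <p>${product.price}</p>
  </body>
</html>`;
}

console.log(renderProductPage({
  name: 'Trail Shoe',
  description: 'Lightweight trail running shoe',
  price: '$89',
}));
```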
Static Site Generation (SSG) is even better for content that does not change per request. The HTML is pre-built at deploy time, so there is no per-request rendering work at all:
// Astro: static by default, zero JavaScript shipped
---
const products = await fetch('https://api.example.com/products').then(r => r.json());
---
<ul>
  {products.map(p => <li>{p.name}</li>)}
</ul>
Testing JavaScript Rendering
Google Search Console URL Inspection
The URL Inspection tool shows exactly what Googlebot sees after rendering. Compare the "HTML" tab (raw source) with the "Screenshot" tab (rendered output). If critical content only appears in the screenshot view, it depends on JavaScript rendering.
Chrome DevTools with JavaScript Disabled
Disable JavaScript in DevTools (Settings > Debugger > Disable JavaScript) and reload your page. Whatever is missing is content that search engines must render to discover.
Comparing Source vs Rendered HTML
# Fetch raw HTML (what Googlebot sees in the crawl phase)
curl -s https://example.com/page | grep -c '<h1>'
# Fetch the rendered DOM -- headless Chrome's --dump-dom prints the DOM
# after JavaScript executes (Puppeteer's page.content() returns the same)
google-chrome --headless --dump-dom https://example.com/page | grep -c '<h1>'
# If the H1 counts differ, content depends on JavaScript
Hydration Errors
When SSR HTML does not match what the client-side JavaScript produces, React and other frameworks throw hydration errors and may re-render the entire page. This causes:
- Content flashing (visible to users and Googlebot)
- Wasted rendering time
- Potential CLS issues
Fix hydration mismatches by ensuring server and client render identical initial output. Common causes include date/time formatting (server timezone vs client timezone), random IDs, and browser-only APIs used during initial render.
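The timezone cause is the most common; pinning locale and timezone makes server and client output identical. A minimal sketch (the date value is arbitrary):

```javascript
// Mismatch source: output depends on the runtime's timezone and locale,
// so the server and the client can render different strings.
const d = new Date('2024-01-15T23:30:00Z');
const unstable = d.toLocaleDateString(); // varies by environment

// Fix: pin locale and timezone so both environments agree.
const stable = new Intl.DateTimeFormat('en-US', {
  dateStyle: 'medium',
  timeZone: 'UTC',
}).format(d);

console.log(stable); // "Jan 15, 2024"
```

The same principle applies to random IDs (generate them once on the server and pass them down) and to browser-only APIs (defer them to an effect that runs after hydration).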
Critical Rules for JavaScript SEO
- Every indexable page must return complete <title>, <meta name="description">, and <h1> tags in the initial HTML response
- Internal links must use standard <a href=""> tags, not JavaScript click handlers
- Canonical tags and hreflang must be in the server-rendered HTML, not injected by JavaScript
- Avoid #fragment URLs for navigation -- Googlebot ignores URL fragments
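The internal-link rule in practice (the path and router.push are placeholders):

```
<!-- Crawlable: a standard anchor Googlebot can discover and fetch -->
<a href="/products/trail-shoe">Trail Shoe</a>

<!-- Not crawlable: no href, so the URL is invisible to the crawler -->
<span onclick="router.push('/products/trail-shoe')">Trail Shoe</span>
```

Client-side routers can still intercept clicks on real anchors for fast navigation; the href just has to exist so crawlers can follow it.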
Performance Budget for JavaScript
Target under 300KB of compressed JavaScript per page. Pages exceeding 500KB risk render timeouts in Googlebot and poor TBT/INP scores for users. Use the Chrome DevTools Coverage tab to identify unused JavaScript and remove or defer it.