GEO Fundamentals · 5 min read

The render gap: why your website is invisible to AI

Your homepage loads beautifully in a browser. An AI crawler sees a skeleton. Here's why — and how to measure the gap for your own domain.

Modern websites are JavaScript-first. The browser downloads a mostly-empty HTML shell, runs hundreds of kilobytes of JS, calls a dozen APIs, then paints the UI. For a human with a fast laptop this is invisible — the page feels instant.

For an AI crawler, it's a wall.

What AI crawlers actually receive

Most AI training and indexing bots — GPTBot, ClaudeBot, PerplexityBot, Google-Extended — are plain HTTP crawlers. They fetch a URL, read the response body, and move on. They don't run JavaScript, so nothing hydrates, no client-side routing runs, and no API calls fire to populate content.

So when an AI crawler visits a React or Next.js site that relies on client-side data fetching, it sees the skeleton: a `<div id="__next"></div>` and some script tags. Navigation items rendered by JavaScript? Gone. Product descriptions fetched from an API? Gone. Pricing tables, specifications, structured data injected after mount? Gone.
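The difference is easy to see with a toy text extractor (this is an illustration, not any crawler's real pipeline; the two HTML strings are made-up examples):

```typescript
// Strip tags from a raw response body, roughly the way an HTTP fetcher
// with no JavaScript engine "sees" a page.
function visibleText(html: string): string {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // scripts are never executed
    .replace(/<[^>]+>/g, " ")                    // drop all markup
    .replace(/\s+/g, " ")
    .trim();
}

// A client-rendered shell: all content lives behind JavaScript.
const csrShell = `<html><body><div id="__next"></div>
  <script src="/main.js"></script></body></html>`;

// A server-rendered page: the content is in the response itself.
const ssrPage = `<html><body><div id="__next">
  <h1>Acme Widget</h1><p>In stock. $49.</p></div></body></html>`;
```

For the shell, `visibleText` returns an empty string; for the server-rendered page, it returns the product name and price. That empty string is what the crawler has to work with.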

An AI model trained on that response — or an LLM agent visiting your site to complete a task — is working from a stripped, near-empty version of your content.

Server-side rendering closes the gap

Next.js, Nuxt, SvelteKit, and Remix all support server-side rendering (SSR) or static generation (SSG). When you use `getServerSideProps`, `getStaticProps`, or the App Router's async server components, the full HTML — including all content — is in the initial HTTP response. That's what the crawler receives.
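As a sketch of the pages-router pattern (the `Product` type and `loadProduct` helper are hypothetical stand-ins for your real data layer):

```typescript
type Product = { name: string; description: string };

// Stand-in for whatever database or API call the page really makes.
async function loadProduct(): Promise<Product> {
  return { name: "Acme Widget", description: "A useful widget." };
}

// Runs on the server for each request; Next.js renders the result into
// the initial HTML response, so a non-JS crawler receives the content.
export async function getServerSideProps() {
  const product = await loadProduct();
  return { props: { product } };
}
```

The equivalent anti-pattern is fetching the same data in a `useEffect` after mount: the page works in a browser, but the initial response the crawler reads is empty.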

The fix isn't to add a special AI-friendly mode. It's to ship HTML that means something without JavaScript. SSR and SSG are already the right architecture for performance, SEO, and accessibility. GEO readiness is another reason to get there.

How to measure your render gap

Two approaches:

  1. curl test: `curl -s -A "ClaudeBot/1.0" https://yourdomain.com | grep -o "<p" | wc -l`. (Note that `grep -c` counts matching lines, so it under-reports on minified single-line HTML; `grep -o | wc -l` counts occurrences.) If the count is near-zero while your live page has dozens of content blocks, you have a gap.
  2. Disable JS in DevTools. Chrome DevTools → Settings → Preferences → Debugger → Disable JavaScript. Reload. What you see is roughly what AI crawlers see. If the page is blank or barely functional, your render gap is severe.
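The curl check can also be sketched in code. Here the five-paragraph threshold is an arbitrary assumption to tune per site, not a standard:

```typescript
// Count occurrences of a marker in the raw HTML a crawler would
// receive (no JavaScript executed before counting).
function countMatches(html: string, marker: RegExp): number {
  return (html.match(marker) ?? []).length;
}

// Flag a render gap when the static response carries almost no
// paragraph content. Threshold of 5 is an assumption, not a standard.
function hasRenderGap(staticHtml: string, minParagraphs = 5): boolean {
  return countMatches(staticHtml, /<p[\s>]/g) < minParagraphs;
}
```

Feed it the body of a fetch made with a crawler user-agent string, and compare against what you know the rendered page contains.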

Hidden Layer's audit checks this automatically — comparing the static HTML response against expected structural markers. A site with a high render-gap score in the AI Visibility category has content locked behind JavaScript that AI agents can't reach.

What about dynamic content?

Not all dynamic content needs to be in the initial render. Product reviews loaded on scroll, personalised recommendations, live inventory — these are fine as client-side. The content that matters for AI is the canonical information: product specs, prices, descriptions, structured data, navigation, contact information.

If that content is in your initial HTML, AI crawlers can read it. If it's hydrated in later, they can't.

Tags: Render Gap · AI Crawlers · GEO

See how your domain scores against these checks →

Run a free audit