

SSR, SSG, and ISR for AI Crawlers: Why JavaScript-Only Sites Lose Visibility

Why many AI crawlers do not execute JavaScript and how SSR, SSG, and ISR make public content visible to ChatGPT, Claude, Perplexity, and Google AI.

Tags: SSR, SSG, ISR, AI crawlers
Vladislav Puchkov
Founder of GEO Scout, GEO optimization expert

Rendering is a GEO factor because AI models cannot cite content they cannot retrieve. A beautiful SPA can be nearly invisible to GPTBot, ClaudeBot, or PerplexityBot if the server response contains only scripts.

GEO Scout makes this measurable: after migrating public pages from CSR to SSR, SSG, or ISR, teams can measure provider-level citation changes in geoscout.pro.

What Crawlers Receive

Use curl to inspect the actual response:

curl -A "GPTBot/1.0" -s https://example.com/ | head -80

If you see this pattern, AI visibility is weak:

<body>
  <div id="root"></div>
  <script src="/assets/app.js"></script>
</body>

The bot receives the title and meta description, but none of the body content that would only appear after JavaScript execution.
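This check can be automated by stripping scripts and tags from the fetched HTML and measuring how much visible text remains. A minimal sketch; the function name and the regex-based stripping are illustrative simplifications, not a real HTML parser:

```typescript
// Rough heuristic: how much human-visible text does raw HTML contain?
function visibleTextLength(html: string): number {
  const withoutCode = html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<style[\s\S]*?<\/style>/gi, "");
  // Drop remaining tags, collapse whitespace, and count what is left.
  const text = withoutCode.replace(/<[^>]+>/g, " ").replace(/\s+/g, " ").trim();
  return text.length;
}

// An empty SPA shell yields no visible text at all.
const spaShell =
  `<body><div id="root"></div><script src="/assets/app.js"></script></body>`;
console.log(visibleTextLength(spaShell)); // → 0
```

If the number is near zero for a public page, a non-JavaScript crawler has nothing to index or cite.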

Rendering Options

| Mode | AI friendliness | Best use |
| --- | --- | --- |
| SSR | High | Dynamic public pages |
| SSG | Very high | Blog, docs, landing pages, pricing |
| ISR | Very high | Static pages that need periodic updates |
| CSR | Low for public pages | Authenticated app screens and dashboards |

Hybrid Architecture

Most SaaS teams do not need to rewrite the entire product:

/                -> SSG or SSR
/pricing         -> SSG or ISR
/blog/[slug]     -> SSG
/docs/[slug]     -> SSG
/compare/*       -> SSG or ISR
/app/dashboard   -> CSR behind auth
/app/settings    -> CSR behind auth

Public pages are for crawlers and prospects. Private app pages are for users.
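In a Next.js App Router project, for example, this split maps to per-route rendering settings. A sketch of the pricing route as ISR; the file path and the 24-hour revalidation interval are illustrative assumptions, not recommendations:

```typescript
// app/pricing/page.tsx (sketch)
// Static HTML served from cache, regenerated in the background at most
// once per `revalidate` seconds (ISR). Crawlers always receive full HTML.
export const revalidate = 86400; // example: refresh once a day

export default function PricingPage() {
  return (
    <main>
      <h1>Pricing</h1>
      {/* All pricing content is present in the server-rendered response. */}
    </main>
  );
}
```

Blog and docs routes can drop `revalidate` entirely and build as pure SSG, while `/app/*` routes stay client-rendered behind authentication.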

Structured Data Must Also Be Server-Visible

If FAQPage, HowTo, Article, Product, or BreadcrumbList JSON-LD is injected only after client hydration, many AI crawlers will miss it. Render JSON-LD in the initial HTML.
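One way to guarantee this is to serialize the JSON-LD on the server and concatenate it into the HTML before the response is sent. A framework-agnostic sketch; the helper name and shape are assumptions:

```typescript
// Build a <script type="application/ld+json"> tag as a plain string,
// so it can be embedded in server-rendered HTML rather than injected by JS.
function faqJsonLd(questions: { q: string; a: string }[]): string {
  const data = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: questions.map(({ q, a }) => ({
      "@type": "Question",
      name: q,
      acceptedAnswer: { "@type": "Answer", text: a },
    })),
  };
  // Escape "</" so the JSON payload cannot close the script tag early.
  const json = JSON.stringify(data).replace(/<\//g, "<\\/");
  return `<script type="application/ld+json">${json}</script>`;
}

const tag = faqJsonLd([{ q: "Do AI crawlers run JS?", a: "Many do not." }]);
```

Because the tag is a string built on the server, it appears in the first HTML response that a non-JavaScript crawler reads.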

Cache Headers

For public static pages, prefer cacheable responses:

Cache-Control: public, s-maxage=86400, stale-while-revalidate=3600

Avoid no-store for public content unless there is a strong reason. Slow uncached pages can reduce crawl efficiency.
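In Next.js, for example, such headers can be attached to public routes from the config file. A sketch; the route pattern and values are illustrative:

```typescript
// next.config.js (sketch) — cacheable headers for public marketing routes.
module.exports = {
  async headers() {
    return [
      {
        source: "/blog/:slug*",
        headers: [
          {
            key: "Cache-Control",
            value: "public, s-maxage=86400, stale-while-revalidate=3600",
          },
        ],
      },
    ];
  },
};
```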

Migration Checklist

  1. Identify high-value public pages that return empty or thin HTML.
  2. Move blog, docs, pricing, and comparison pages to SSG or ISR.
  3. Move dynamic public pages to SSR.
  4. Render schema and FAQ in initial HTML.
  5. Validate with curl and server logs.
  6. Track citation changes in GEO Scout over the next crawl cycle.

Frequently Asked Questions

Do AI crawlers execute JavaScript?
Many important AI crawlers primarily read the first HTML response and do not execute client-side JavaScript. Google and Bing can render some JavaScript, but that should not be your baseline for AI visibility.
Why is CSR risky for GEO?
A pure client-side rendered SPA often returns an empty root div and scripts. If the bot does not run JavaScript, it cannot see your content, headings, FAQ, or structured data.
Is SSR or SSG better for AI?
SSG is best for stable marketing pages, blogs, and documentation because it serves complete HTML from a CDN. SSR is useful for dynamic pages. ISR combines static HTML with scheduled regeneration.
Do dashboards need SSR for AI?
No. Private authenticated dashboards should usually stay client-side or app-rendered because AI crawlers should not access them. Public pages are the priority.
How do I test what an AI bot sees?
Use curl with a bot user-agent and inspect the raw HTML. If the response has little or no visible text, crawlers probably cannot use the page.
How can rendering migration be measured?
Track baseline AI citations, move public pages to SSR, SSG, or ISR, then compare Mention Rate and Domain Citation Rate in GEO Scout after recrawling.