SSR, SSG, and ISR for AI Crawlers: Why JavaScript-Only Sites Lose Visibility
Why many AI crawlers do not execute JavaScript and how SSR, SSG, and ISR make public content visible to ChatGPT, Claude, Perplexity, and Google AI.
Rendering is a GEO factor because AI models cannot cite content they cannot retrieve. A beautiful SPA can be nearly invisible to GPTBot, ClaudeBot, or PerplexityBot if the server response contains only scripts.
GEO Scout makes this measurable: after migrating public pages from CSR to SSR, SSG, or ISR, teams can measure provider-level citation changes in geoscout.pro.
What Crawlers Receive
Use curl to inspect the actual response:
```bash
curl -A "GPTBot/1.0" -s https://example.com/ | head -80
```

If you see this pattern, AI visibility is weak:
```html
<body>
  <div id="root"></div>
  <script src="/assets/app.js"></script>
</body>
```

The bot receives the title and meta description, but not the body content rendered after JavaScript execution.
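This check can also be automated. The sketch below is illustrative, not an official tool: the `looksLikeEmptyShell` helper and its 200-character threshold are assumptions, and the regex-based extraction is a heuristic, not a real HTML parser.

```typescript
// Heuristic: does server-rendered HTML contain real text content,
// or only a script-loading shell? The threshold is an assumption.
function looksLikeEmptyShell(html: string, minTextChars = 200): boolean {
  // Keep only the <body> contents (falls back to the whole string if absent).
  const body = html
    .replace(/[\s\S]*<body[^>]*>/i, "")
    .replace(/<\/body>[\s\S]*/i, "");
  // Drop inline scripts, then strip remaining tags to get visible text.
  const withoutScripts = body.replace(/<script[\s\S]*?<\/script>/gi, "");
  const text = withoutScripts.replace(/<[^>]+>/g, "").trim();
  return text.length < minTextChars;
}
```

Run it against the raw response body from curl or fetch: a CSR shell like the snippet above yields almost no visible text and is flagged, while an SSR or SSG page passes.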
Rendering Options
| Mode | AI friendliness | Best use |
|---|---|---|
| SSR | High | Dynamic public pages. |
| SSG | Very high | Blog, docs, landing pages, pricing. |
| ISR | Very high | Static pages that need periodic updates. |
| CSR | Low for public pages | Authenticated app screens and dashboards. |
Hybrid Architecture
Most SaaS teams do not need to rewrite the entire product:
```
/               -> SSG or SSR
/pricing        -> SSG or ISR
/blog/[slug]    -> SSG
/docs/[slug]    -> SSG
/compare/*      -> SSG or ISR
/app/dashboard  -> CSR behind auth
/app/settings   -> CSR behind auth
```

Public pages are for crawlers and prospects. Private app pages are for users.
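In a framework such as Next.js with the App Router, this split maps to per-route rendering choices. The file path and page content below are illustrative assumptions; `export const revalidate` is the App Router's route segment config for ISR.

```typescript
// app/pricing/page.tsx -- ISR: served as static HTML,
// regenerated in the background at most once per hour.
export const revalidate = 3600;

export default function PricingPage() {
  return (
    <main>
      <h1>Pricing</h1>
      {/* Content here is present in the initial HTML response. */}
    </main>
  );
}
```

Routes under `/app/*` simply omit static generation and render client-side behind authentication, since crawlers never see them anyway.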
Structured Data Must Also Be Server-Visible
If FAQPage, HowTo, Article, Product, or BreadcrumbList JSON-LD is injected only after client hydration, many AI crawlers will miss it. Render JSON-LD in the initial HTML.
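One way to guarantee this is to build the JSON-LD string during server rendering and inline it in the page head. A minimal sketch, assuming a helper of your own (the `faqJsonLd` name and the FAQ data shape are illustrative):

```typescript
// Build a FAQPage JSON-LD <script> tag on the server, so crawlers
// see it in the initial HTML without executing any JavaScript.
type Faq = { question: string; answer: string };

function faqJsonLd(faqs: Faq[]): string {
  const data = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: faqs.map((f) => ({
      "@type": "Question",
      name: f.question,
      acceptedAnswer: { "@type": "Answer", text: f.answer },
    })),
  };
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}
```

The returned tag is concatenated into the server response (or rendered via your framework's head API), never injected after hydration.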
Cache Headers
For public static pages, prefer cacheable responses:
```
Cache-Control: public, s-maxage=86400, stale-while-revalidate=3600
```

Avoid no-store for public content unless there is a strong reason. Slow uncached pages can reduce crawl efficiency.
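A small helper keeps these values consistent across routes. This is a sketch under the assumption that you set headers per response; the function name and defaults are illustrative, not a library API.

```typescript
// Compose a Cache-Control value for public static pages.
// Defaults match a one-day shared cache with an hour of
// stale-while-revalidate; tune per page.
function publicCacheControl(sMaxAge = 86400, swr = 3600): string {
  return `public, s-maxage=${sMaxAge}, stale-while-revalidate=${swr}`;
}
```

Apply it with your server's header API, e.g. `res.setHeader("Cache-Control", publicCacheControl())` in a Node-style handler.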
Migration Checklist
- Identify high-value public pages that return empty or thin HTML.
- Move blog, docs, pricing, and comparison pages to SSG or ISR.
- Move dynamic public pages to SSR.
- Render schema and FAQ in initial HTML.
- Validate with curl and server logs.
- Track citation changes in GEO Scout over the next crawl cycle.
Frequently Asked Questions
Do AI crawlers execute JavaScript?
Why is CSR risky for GEO?
Is SSR or SSG better for AI?
Do dashboards need SSR for AI?
How do I test what an AI bot sees?
How can rendering migration be measured?
Related Articles
Breadcrumbs Schema for AI: How Site Hierarchy Helps Neural Search Cite You
How BreadcrumbList helps AI systems understand site architecture, attribute pages correctly, and cite the right section of your website.
Cloudflare AI Audit and Bot Management: How to Control AI Crawlers
How Cloudflare AI Audit, Bot Management, AI Labyrinth, and pay-per-crawl policies help teams allow, limit, or block AI bots.
HowTo Schema for AI Answers: Step-by-Step Markup That Neural Search Can Reuse
How HowTo schema helps ChatGPT, Perplexity, Gemini, and Google AI Overviews extract ordered instructions from your pages.