Everyone assumes that big tech companies have flawless SEO. They have dedicated teams, massive budgets, and the best engineers in the world. Their websites must be perfectly optimized, right?
I decided to test that assumption. I scanned five well-known JavaScript-heavy sites — react.dev, vercel.com, stripe.com/docs, linear.app, and shopify.com — using automated SEO checks that analyze what Googlebot actually sees versus what users see in their browser.
The results were surprising. Every single site had issues. Some had a lot of them.
The Sites and Their Scores
Each site was scanned across 10 pages with 19 SEO health checks per page, covering meta tags, structured data, headings, images, internal linking, page speed, and crawlability.
react.dev — 74/100 (Best of the group)
React's own documentation scored highest. Internal linking was strong with an average of 3.5 links per page and zero orphan pages. All pages were indexable with proper canonical URLs and Open Graph tags.
The weak spots: zero structured data across every page scanned, meta descriptions too short on 9 out of 10 pages, and a JavaScript file that failed to load on the /versions page. The irony of React's docs having a broken JavaScript file was hard to ignore.
vercel.com — 71/100 (Solid fundamentals, sloppy details)
Vercel, the company behind Next.js, gets server-side rendering right. All pages are indexable, content depth is good, and Open Graph tags are present everywhere. But the details slip: a missing meta description on /abuse, a missing canonical URL on /academy, a missing H1 heading on a subpage, and 9 orphan pages out of 10 scanned.
For a company that sells deployment infrastructure and promotes SEO best practices, the internal linking was surprisingly weak — an average of 0.0 internal links per page across the scanned set.
stripe.com/docs — 60/100 (Surprising for Stripe)
I expected Stripe to ace this. They nail the fundamentals — every page has titles, meta descriptions, H1 headings, canonical URLs, Open Graph tags, and proper heading hierarchy. There were zero orphan pages, and internal linking was decent.
But then: zero structured data on all 10 pages, which is a huge missed opportunity for documentation that could show rich snippets in search results. 7 out of 10 pages had images without alt text. 8 out of 10 loaded slowly at over 3 seconds. And API requests failed on every single page scanned, meaning some content may not be loading for crawlers at all.
linear.app — 57/100 (The SPA showing its seams)
Linear is a beautifully designed product, but the SEO tells a different story. Zero structured data, every meta description too short, 8 out of 10 titles too short, all 10 pages slow to load, and 4 orphan pages.
The most telling stat: the internal link average was just 0.5 links per page. This suggests the single-page application architecture isn't generating proper crawlable links between pages. JavaScript console errors appeared on 2 pages and API requests failed on 2 more.
shopify.com — 39/100 (The biggest surprise)
Shopify scored lowest, and it's the biggest company in the group. The crawler landed on their Dutch locale pages, which revealed issues you'd never catch checking only the English site.
8 out of 10 pages were orphaned with no internal links pointing to them. A login page got crawled and correctly flagged as noindex, but it consumed one of the scan slots. API requests failed on 7 out of 10 pages. Missing H1 headings on 2 pages. No structured data on 8 out of 10 pages. Even Shopify's own website has SEO gaps, which is humbling for a company that sells e-commerce tools.
The Patterns That Appeared Everywhere
After looking at the results across all five sites, several patterns stood out.
Structured Data Is Universally Neglected
4 out of 5 sites had zero Schema.org markup on every page scanned. Structured data is what enables rich snippets in search results — star ratings, FAQ dropdowns, breadcrumbs, and other enhanced listings. It takes 20 to 30 minutes to implement per page type, and it's one of the highest-ROI SEO investments you can make. Yet almost nobody does it.
For documentation sites like Stripe and React, adding structured data for articles or how-to guides could significantly improve their search presence. For a product site like Linear, Organization and SoftwareApplication schemas would help Google understand what the site actually is.
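As a rough sketch of what that fix looks like, here is a minimal Article JSON-LD object of the kind a docs page could embed in a `<script type="application/ld+json">` tag. The page title, description, and URL below are hypothetical placeholders, not values from any of the scanned sites.

```javascript
// Build a minimal Article JSON-LD object for a docs page.
// All values passed in here are hypothetical examples.
function articleJsonLd({ headline, description, url }) {
  return {
    "@context": "https://schema.org",
    "@type": "Article",
    headline,
    description,
    url,
  };
}

// Serialize it for embedding in the page's <head> inside a
// <script type="application/ld+json"> tag.
const jsonLd = JSON.stringify(
  articleJsonLd({
    headline: "Getting started with the API",
    description: "A step-by-step guide to making your first request.",
    url: "https://example.com/docs/getting-started",
  })
);
```

Swap `Article` for `HowTo`, `Product`, or `SoftwareApplication` depending on the page type; the shape stays the same.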
Meta Descriptions Are an Afterthought
Short, generic, or missing meta descriptions were everywhere. These are the text snippets that appear below your page title in Google search results. They directly affect whether someone clicks on your result or scrolls past it.
Most of the meta descriptions found were under 120 characters — too short to be compelling. The ideal length is 150 to 160 characters, giving you enough space to explain what the page offers and include a call to action.
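The length rule above is easy to automate. A minimal checker, using the thresholds from this post (under 120 characters is too short, 150 to 160 is the sweet spot), might look like this:

```javascript
// Rough length check for a meta description, using the thresholds
// described above: under 120 chars is too short to be compelling,
// 150-160 is ideal, and longer risks truncation in search results.
function checkMetaDescription(description) {
  const len = description.trim().length;
  if (len === 0) return "missing";
  if (len < 120) return "too short";
  if (len > 160) return "may be truncated";
  return "ok";
}
```

Run it over every page's `<meta name="description">` content as part of a build step and you catch these regressions before they ship.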
Image Alt Text Is Consistently Missing
Every single site had pages with images lacking alt text. This affects both accessibility (screen readers can't describe the image) and SEO (Google uses alt text to understand image content for image search).
The fix takes about 2 minutes per image. It's one of the easiest SEO improvements you can make, yet it's skipped consistently even by the most well-resourced teams.
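Finding the offenders is also cheap to automate. A regex-based spot check like the one below (a heuristic, not a real HTML parser) counts `<img>` tags in raw HTML that have no alt attribute at all:

```javascript
// Rough audit: count <img> tags in raw HTML that carry no alt
// attribute. A regex heuristic, not a real HTML parser -- good
// enough for a quick spot check, not for edge cases.
function imagesMissingAlt(html) {
  const imgTags = html.match(/<img\b[^>]*>/gi) || [];
  return imgTags.filter((tag) => !/\balt\s*=/i.test(tag)).length;
}
```

Note this only catches a missing attribute; an empty `alt=""` is valid for decorative images and passes here.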
Internal Linking Is Weak on SPAs
Linear and Shopify both had most pages orphaned or poorly linked. This means search engines have difficulty discovering those pages through crawling. Server-rendered sites like Stripe and React had significantly better link structures because their links exist in the initial HTML response rather than being generated by JavaScript after the page loads.
If your site uses client-side routing, check whether your navigation generates real anchor tags with href attributes. Many JavaScript frameworks use click handlers or programmatic navigation that crawlers can't follow.
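One quick sanity check: count how many anchors in your raw HTML carry a real href. The sketch below is a regex heuristic under the same caveats as before; it treats `href="#"` and handler-only anchors as non-crawlable:

```javascript
// Heuristic check for crawlable links: count <a> tags that carry a
// real href, as opposed to anchors that rely on onclick handlers or
// point at "#". Regex-based -- a sketch, not a rendering crawler.
function countCrawlableLinks(html) {
  const anchors = html.match(/<a\b[^>]*>/gi) || [];
  return anchors.filter((tag) => /\bhref\s*=\s*["'][^"'#]/i.test(tag)).length;
}
```

Run this against your page source (not the rendered DOM) to see what a crawler's first pass can actually follow.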
Page Speed Is a Universal Problem
Most pages across all five sites took over 3 seconds to fully load. JavaScript-heavy sites consistently struggle here because the browser needs to download, parse, and execute large bundles before the page is interactive.
Page speed is a confirmed Google ranking factor, and it directly affects user experience. The sites with the fastest load times tended to be the ones with better server-side rendering — the HTML arrives ready to display without waiting for JavaScript execution.
AI Crawlers See Even Less
All of these scores reflect what Googlebot sees after JavaScript rendering. The situation is worse for AI crawlers.
Crawlers from ChatGPT, Perplexity, and other AI search tools don't render JavaScript at all. They fetch the raw HTML and that's it. No script execution, no API calls, no client-side rendering. Any content that depends on JavaScript to appear is completely invisible to AI search.
This means the gap between user experience and crawler experience is even larger than these scores suggest. Sites using client-side rendering are not just delayed in Google's index — they're entirely absent from AI-powered search results.
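You can simulate this yourself: fetch the raw HTML (no script execution, just like these crawlers) and check whether a key phrase from your page is present. The URL in the usage comment is a hypothetical example; point it at your own pages.

```javascript
// Pure check, split out from the network call so it can be tested
// offline: does this HTML contain the phrase as plain text?
function containsPhrase(html, phrase) {
  return html.includes(phrase);
}

// Fetch the raw HTML the way a non-rendering crawler would -- no
// script execution, no API calls -- and look for the phrase.
// Requires Node 18+ for the global fetch.
async function phraseInRawHtml(url, phrase) {
  const res = await fetch(url);
  return containsPhrase(await res.text(), phrase);
}

// Usage (hypothetical URL -- substitute one of your own pages):
// phraseInRawHtml("https://example.com/pricing", "Pricing plans")
//   .then((found) =>
//     console.log(found ? "visible without JavaScript" : "JS-only content")
//   );
```

If the phrase is visible in your browser but this check returns false, that content is invisible to AI search.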
What You Can Do About It
If you're building with React, Next.js, Vue, or any JavaScript framework, here's where to start:
Check your page source. Right-click on any page, select View Page Source, and look at the raw HTML. If your content isn't there, crawlers aren't seeing it on the first pass.
Add structured data. Pick the most relevant Schema.org type for your pages — Article for blog posts, Product for e-commerce, SoftwareApplication for SaaS — and add JSON-LD markup to your pages.
Write proper meta descriptions. 150 to 160 characters, unique per page, with a clear value proposition. This is a 5-minute fix per page.
Add alt text to every image. Describe what the image shows in plain language. Two minutes per image.
Fix internal linking. Make sure every important page has at least 2 to 3 internal links pointing to it from other pages. Use real anchor tags, not JavaScript click handlers.
Use server-side rendering for important pages. If SEO matters for a page, make sure the content is in the initial HTML response. This is especially critical now that AI crawlers don't render JavaScript.
The Bottom Line
Even the best engineering teams overlook SEO fundamentals. Structured data, meta descriptions, alt text, and internal linking are not glamorous, but they directly affect whether people find your site through search.
The gap between what users see and what crawlers see is real, and it's bigger than most developers think. The first step to fixing it is knowing it exists.
Note: these results are based on 10-page scans — a snapshot, not a full site audit. But the patterns are consistent and the issues are real. You can verify any of them by viewing page source on these sites yourself.
Want to see what Google sees on your site?
JSVisible renders your pages as both a user and Googlebot, side by side. 35+ SEO checks, screenshot comparisons, and plain-English fixes. Free to start.
Try a Free Scan