In March 2026, Google updated its crawler documentation with a number that caught a lot of attention: 2 MB. That's the maximum amount of HTML Googlebot will process per URL. The previous documented limit was 15 MB. Naturally, the SEO world reacted with a mix of alarm and hot takes.

Here's the thing: this probably isn't as dramatic as the LinkedIn posts make it sound. But it's worth understanding what actually changed and whether your site is affected.

What changed

Google corrected its documentation to reflect what Googlebot has likely been doing for a while. The 15 MB figure was in the docs for years, but actual crawling behavior suggested a much lower threshold. Google's March 2026 blog post confirmed: Googlebot fetches up to 2 MB of uncompressed data per HTML URL.

This isn't a sudden technical change — it's a documentation fix. But it matters because it gives us a concrete number to work with for the first time.

The actual limits

Google crawl size limits (2026)

  • HTML pages (Googlebot): 2 MB uncompressed
  • PDF files: 64 MB
  • Other resources (CSS, JS, images): 15 MB default for crawlers without a specific limit

The 2 MB applies to the raw, uncompressed HTML — including everything inline: CSS, JavaScript, SVGs, base64-encoded images. Externally linked resources (separate CSS files, JS files, images) are fetched separately, each with their own limit.

If your HTML exceeds 2 MB, Google doesn't skip the page entirely. It truncates — meaning it processes the first 2 MB and ignores the rest. Content, structured data, or links that fall after the 2 MB mark simply won't be indexed.
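Truncation is easy to simulate locally. This sketch builds an oversized test page (a hypothetical ~3 MB file with a link buried at the very end), then uses `head -c` to keep only the first 2 MB, approximating what Googlebot would process:

```shell
#!/bin/sh
# Simulate Googlebot's 2 MB cutoff on a saved HTML file.
# Anything after byte 2,097,152 is simply not seen.
LIMIT=$((2 * 1024 * 1024))   # 2 MB in bytes

# Build an oversized test page: ~3 MB of filler with a link at the very end.
{
  printf '<html><body>'
  head -c 3000000 /dev/zero | tr '\0' 'x'
  printf '<a href="/deep-page">buried link</a></body></html>'
} > page.html

# What Googlebot would process:
head -c "$LIMIT" page.html > truncated.html

wc -c < page.html        # full size, ~3 MB
wc -c < truncated.html   # exactly 2 MB
grep -q 'buried link' truncated.html || echo 'buried link not found in first 2 MB'
```

The buried link survives in the full file but not in the truncated one, which is exactly how content past the limit drops out of indexing.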

Who's actually affected

Almost nobody. According to a Seobility analysis of 44.5 million pages, only 0.82% exceed 2 MB of HTML. The median HTML size across the web is just 20 KB. Even at the 90th percentile, pages are only around 392 KB — less than 20% of the limit.

The sites most likely to hit the 2 MB wall are:

  • Pages that inline everything: CSS, JavaScript, SVGs, and base64-encoded images embedded directly in the HTML
  • Server-rendered pages that pack very long product listings, feeds, or archives into a single document

If your site is a standard business site, blog, or even a mid-size e-commerce shop — you're almost certainly fine.

How to check your pages

The quickest way to check is to look at the HTML response size of your key pages. There are several ways to do this:

Browser DevTools

Open Chrome DevTools (F12), go to the Network tab, reload the page, click the main document request, and check the Response size. Make sure you're looking at the uncompressed size, not the transfer size (which is usually gzip-compressed and much smaller).
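The gap between transfer size and uncompressed size exists because HTML compresses extremely well. A quick local illustration with a hypothetical sample file (assumes `gzip` is installed; your ratio will vary with real content):

```shell
#!/bin/sh
# HTML is highly repetitive, so gzip often shrinks it by 5-10x or more.
# That's why the transfer size in DevTools understates the uncompressed
# size that the 2 MB limit is actually measured against.

# Hypothetical sample: 1 MB of repetitive markup.
yes '<div class="card"><span>item</span></div>' | head -c 1048576 > sample.html

wc -c < sample.html            # uncompressed size: 1 MB
gzip -c sample.html | wc -c    # compressed "transfer" size: far smaller
```

If you only ever look at the compressed number, a page can be well past 2 MB uncompressed while looking harmless on the wire.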

Lumina Tech Stack Detector

We just added an HTML Size metric to the Lumina Tech Stack Detector. Run any URL and you'll see the HTML size with color-coded feedback: green if you're well under the limit, yellow if you're approaching it, red if you've exceeded it.

Command line

curl -s https://example.com | wc -c

This gives you the byte count of the raw HTML response. By default curl doesn't request compression, so this is the uncompressed size, which is what the 2 MB limit is measured against.
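The same check can be wired into a script. This sketch classifies a byte count against the 2 MB limit; note that the 1.5 MB "warning" threshold is our own margin, not a Google number:

```shell
#!/bin/sh
# Classify an HTML byte count against the 2 MB crawl limit.
# The 1.5 MB warning threshold is an arbitrary safety margin, not official.
html_size_status() {
  bytes=$1
  limit=$((2 * 1024 * 1024))   # 2 MB
  warn=$((limit * 3 / 4))      # 1.5 MB
  if [ "$bytes" -gt "$limit" ]; then
    echo red      # over the limit: content past 2 MB won't be indexed
  elif [ "$bytes" -gt "$warn" ]; then
    echo yellow   # approaching the limit
  else
    echo green    # well under the limit
  fi
}

html_size_status 20480      # the 20 KB web median: green
html_size_status 2500000    # over the limit: red
```

Combine it with the `curl | wc -c` one-liner above to audit a list of URLs in a loop.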

What to do if you're over the limit

If you actually have pages exceeding or approaching 2 MB, here are the highest-impact fixes:

  • Move inlined CSS and JavaScript into external files, which are fetched separately under their own limits
  • Swap base64-encoded images and inline SVGs for externally linked files
  • Trim server-rendered bloat: paginate very long listings and cut repeated boilerplate markup

Don't over-optimize for this. If your pages are under 500 KB — and most are — the 2 MB limit is irrelevant to you. Focus your technical SEO effort where it actually moves the needle.

Check your HTML size in one click

The Lumina Tech Stack Detector now shows HTML size alongside detected technologies. Green means safe, yellow means you're approaching the limit, and red means you're over it.

Try Tech Stack Detector