What was &num=100 anyway?
In Google search URLs, there used to be a “hidden” or semi-undocumented parameter called num that let you request more than the default number of search results on one page. For example: https://www.google.com/search?q=example&num=100 would (in many cases) return 100 organic results in a single page instead of Google’s usual ~10 per page.
This was never officially supported (i.e. Google never prominently advertised it), but it worked consistently enough that many SEO tools, rank trackers, and power users adopted it.
It was especially useful for:
- Rank tracking tools to fetch up to 100 results with one request.
- Scrapers and tools that needed to examine deep SERP data.
- Quick manual checks by SEOs wanting a long list of results.
It essentially let someone “see deeper” into Google’s results more efficiently.
What changed: Google disables / drops support for num=100
Sometime around mid-September 2025 (reports generally point to September 11–14 as the key dates), Google began blocking or ignoring the &num=100 parameter. Queries that include &num=100 now typically return only ~10 results, and the parameter is honored inconsistently depending on context (login status, region, etc.).
Search Engine Land confirmed that Google responded to inquiries by saying: “The use of this URL parameter is not something that we formally support.”
Thus, it appears intentional rather than a bug.
Some observations from the change:
- The parameter sometimes still “works” (i.e. returns many results) in certain contexts (signed-out vs signed-in, browser type, region) — which suggests a staged rollout or A/B testing.
- Tools and APIs built around bulk SERP fetching are either partially broken or need to adjust their fetching logic.
- Some API providers (for SEO / SERP data) have confirmed that even if num=100 is specified, they may only get 10 results in return.
So, the effect is that Google is actively clamping down on that “shortcut” to see many results in one request.
Why this matters (and why Google might have done it)
To a casual searcher, this probably feels negligible — you rarely scroll past page 1 or 2. But for SEOs, data analysts, and tool vendors, this is a big deal. Here are several reasons why it matters and possible motivations from Google’s side.
1. More work (and cost) for rank-tracking tools
Because many SEO tools relied on fetching up to 100 results in one go, the removal of num=100 means:
- They now have to issue multiple requests (e.g. 10 separate “pages” of 10 results each) to cover the same depth, which increases complexity, latency, cost, and load (see the sketch after this list).
- Some tools may reduce how deep they fetch (e.g. only top 50 or top 20 instead of 100) to avoid ballooning costs.
- Dashboards or reports expecting full 100-depth data may now show gaps, missing rankings, or “lost” keywords.
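To make the difference concrete: a tracker covering 1,000 keywords to a depth of 100 now needs roughly 10,000 paginated requests instead of 1,000. The Python sketch below illustrates the change in request pattern; the URLs are illustrative only (most tools go through a SERP API provider rather than hitting google.com/search directly), and nothing here is an officially supported interface.

```python
# Sketch: how the request pattern changes now that num=100 is no longer honored.
# These URLs are illustrative only; in practice most tools query a SERP API
# provider rather than fetching google.com/search directly.
from urllib.parse import urlencode

BASE = "https://www.google.com/search"
query = "example keyword"

# Before: one request could cover positions 1-100.
old_style = f"{BASE}?{urlencode({'q': query, 'num': 100})}"

# After: the same depth takes ten paginated requests of ~10 results each.
new_style = [
    f"{BASE}?{urlencode({'q': query, 'start': start})}"
    for start in range(0, 100, 10)   # start=0, 10, 20, ..., 90
]

print(old_style)       # 1 request for 100 results (no longer honored)
print(len(new_style))  # 10 requests to reach the same depth
```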
2. Cleaner (or at least different) Search Console data
One of the more surprising consequences is that many websites reported sharp drops in impressions in Google Search Console (GSC), especially on desktop, and changes in average position metrics.
Why? One strong hypothesis in the SEO community is that bots and tools were triggering “impressions” in GSC by fetching deep results (positions 11–100) via num=100. Since GSC counts an impression whenever your URL appears on the user’s SERP page (even if the user doesn’t scroll), those bot requests inflated impression counts.
By removing num=100, these “non-human” impressions mostly drop away, causing:
- A dramatic, “overnight” drop in reported impressions.
- A shift upward in average position (because fewer low-ranking impressions are counted).
- More “realistic” data that more closely matches genuine human traffic.
In fact, an SEO dataset showed 87.7% of sites saw declines in impressions after this change, and 77.6% lost unique ranking terms.
3. Mitigating scraping / bot abuse
By limiting how many results can be fetched in one shot, Google raises the friction for automated scrapers, bots, and AI systems that try to “fan out” across many keywords or fetch entire result sets. Some observers speculate that the move is partly intended to:
- Discourage AI and LLM systems (like chatbots) from heavily scraping SERPs via brute force.
- Reduce server load or data abuse associated with large-scale scraping.
- Regain control over SERP data exposure and how third parties access deeper results.
4. A forced shift in SEO and analytics strategy
SEO and analytics strategies that implicitly assumed visibility into positions beyond pages 1–2 are affected. Tools, dashboards, and reporting setups need to be reconfigured for the reduced depth of available data.
For example:
- Before, you might track keywords in positions 1–100; now you might shift to focusing on positions 1–20 or 1–50.
- Historical comparisons will look skewed: if old tools fetched full 100-depth data and new ones don’t, metrics may appear artificially worse post-change.
- Some SEOs and tool providers are calling this the “Great num=100 debacle.”
If this change affects your tools, reporting, or how you monitor performance, here are practical steps to adapt:
1. Don’t panic (yet)
- The change is likely not targeting your site’s rankings — human traffic and click behavior are unchanged.
- The disruption mostly affects data gathering, tracking, and reporting tools.
- Google’s statement suggests they no longer support the parameter—it’s possible further changes or rollbacks may happen.
2. Re-evaluate your “depth” needs
- Do you really need to track / fetch 100 positions for every keyword? For many businesses, the top 10–20 rankings hold most of the value.
- Trim your tool settings or dashboards to focus on what truly matters (e.g., positions 1–20 or 1–50); a minimal filtering sketch follows this list.
- Accept that deeper tracking (positions 50–100) may become more of a “bonus” rather than baseline.
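If your rank tracker exports its data as a CSV, a quick filter is often all it takes to refocus reporting on the positions that matter. The sketch below assumes a hypothetical export with a "position" column; adjust the file and column names to whatever your tool actually produces.

```python
# Sketch: trimming an exported rank report to the positions that matter.
# The file name and column names are assumptions; match them to whatever
# your rank tracker actually exports.
import pandas as pd

ranks = pd.read_csv("rank_report.csv")
top_20 = ranks[ranks["position"] <= 20]      # keep only positions 1-20
top_20.to_csv("rank_report_top20.csv", index=False)

print(f"Kept {len(top_20)} of {len(ranks)} tracked rankings")
```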
3. Update your tools and data pipelines
- Choose tools or vendors that have adapted to the new paradigm (i.e. they fetch deeper results via multiple requests, or offer smart caching / pagination).
- Monitor for updates from your tool vendors — many are issuing patches, pricing changes, or new depth defaults.
- If you build your own scrapers or use APIs (SERP APIs, etc.), modify your logic to fetch page-by-page (e.g. “start=0, start=10, start=20 …”) instead of relying on one large num=100 call; see the sketch after this list.
- Watch your costs: more API calls, more bandwidth, and more processing may all need optimization.
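Here is a minimal sketch of that page-by-page logic. The fetch_serp_page helper is a hypothetical stand-in for whatever your SERP API provider exposes; parameter names, pagination semantics, and rate limits will differ by vendor.

```python
# Sketch of page-by-page fetching that replaces a single num=100 call.
# fetch_serp_page() is a hypothetical stand-in for your SERP API client;
# adjust the parameter names and pagination semantics to match your provider.
import time

def fetch_serp_page(query: str, start: int) -> list[dict]:
    """Hypothetical call returning ~10 organic results beginning at `start`."""
    raise NotImplementedError("Wire this up to your SERP API provider.")

def fetch_top_n(query: str, depth: int = 50, page_size: int = 10) -> list[dict]:
    """Collect up to `depth` results by paginating instead of using num=100."""
    results: list[dict] = []
    for start in range(0, depth, page_size):
        page = fetch_serp_page(query, start=start)
        if not page:           # stop early if the SERP runs out of results
            break
        results.extend(page)
        time.sleep(1)          # each page is now a separate request; pace them
    return results[:depth]
```

Dropping the default depth from 100 to 50 (or 20) in a wrapper like this is also the simplest way to keep request volume, and therefore cost, under control.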
4. Use Search Console (and human metrics) more heavily
- Because Search Console now presumably reflects more human-driven impressions, lean more heavily on GSC data (clicks, queries, positions) for performance monitoring; a small API sketch follows this list.
- Use annotations / change logs to mark the “&num=100 retirement” in your tracking timeline, so sudden drops or shifts don’t confuse future analysis.
- Validate that traffic or conversions are stable — if your site is still getting similar user behavior, then the drop in impressions is mostly a reporting artifact.
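If you want to pull that GSC data programmatically, the Search Analytics API exposes clicks, impressions, and average position. The sketch below uses google-api-python-client with a service account that has been granted access to the property; the credentials file, site URL, and date range are placeholders.

```python
# Sketch: pulling clicks, impressions, and average position from Search Console
# via the Search Analytics API (google-api-python-client + google-auth).
# "service-account.json", the site URL, and the dates below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",   # the property this account can read
    body={
        "startDate": "2025-08-01",        # span the mid-September change
        "endDate": "2025-10-01",
        "dimensions": ["date"],
    },
).execute()

# Daily totals make any pre/post impression shift easy to see.
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"], round(row["position"], 1))
```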
5. Communicate with stakeholders
- Let your clients, team, or stakeholders know: some metrics may look weaker (impressions down, average position up) even though nothing “bad” has happened.
- Explain that this is a systemic, external change affecting all SEO tools, not a site-specific issue.
- Use this as an opportunity to emphasize high-value metrics (traffic, conversions, keyword intent) over raw ranking depth.
Final thoughts & implications
Google’s decision to disable &num=100 is more than a technical tweak — it signals a shift in how search data is exposed, consumed, and protected. For the SEO industry, it’s a jolt: toolmakers need to recalibrate, analysts need to interpret their dashboards differently, and teams need to adjust expectations.
However, the long-term outcome may be healthier data (less bot “noise”), more cost-efficient tool usage (focusing on high-impact positions), and an evolution in how we think about SERP insights.