Google Disables the &num=100 Parameter: What It Means for Your SEO Reports

Why Do We Care?

This reporting update may raise questions, but it won’t affect real performance. Business goals and SEO ROI remain unchanged.

In mid-September 2025, Google removed support for the &num=100 URL parameter, which tools and bots used to pull up to 100 search results per query. Data collection is now limited to Google’s default of 10 results per page.

What Changed:

  • Fewer lower-ranked (page 3–10) impressions are being recorded in Google Search Console (GSC).

  • Impressions dropped sharply across many sites.

  • Average position appears to improve — but only because deep-page data is no longer included.


What Is the &num=100 Parameter?

In September 2025, Google disabled the &num=100 URL parameter that many AI bots and rank tracking tools used to fetch up to 100 search results in one query. This parameter allowed bots and tools to collect ranking and impression data beyond Google’s default 10 results per page.
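For context, the parameter was simply appended to the search results URL. A request like

https://www.google.com/search?q=example+keyword&num=100

(the keyword here is just an illustration) used to return up to 100 organic results in a single page load; the same URL without &num=100 returns only Google’s default 10, which is all that tools can now collect per request.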

Because of this change:

  • Many keywords that appeared in lower-page positions (page 3–10) are no longer being captured in Search Console impressions. 
  • The number of recorded impressions in GSC has dropped sharply for many sites (Search Engine Land). 
  • At the same time, average position in GSC is showing apparent improvements, since those deep-page impressions (which dragged averages down) are disappearing from data sets. 

Reddit threads confirm this pattern:

“Impressions down (often sharply, especially on desktop) … average position up … ‘not losing clicks’. Many SEOs connect this to Google modifying &num=100” (Reddit)
“There is an update that Google seems to have fully removed support for the &num=100 parameter … Impressions are down and avg positions are improved in GSC” (Reddit)

Why This Matters (Especially for GSC Reporting)

Because you rely on Google Search Console for reporting to your customers, you’ll likely see:

  • Big drops in impressions — but not necessarily drops in real traffic 
  • Better average positions — which might look like gains, but often are just a reporting artifact 
  • Fewer keywords showing, especially ones you ranked far down in the SERPs 

It’s important to understand: this isn’t a sign of a sudden SEO failure or penalty. It’s a measurement change.

What It Means for Real Visibility & Traffic

In most cases, clicks, conversions, and genuine user engagement have been much less affected. That’s because those deep-page impressions were rarely clicked by real users.

So, while your impression counts may plunge, your actual traffic (as shown in Analytics or server logs) may remain stable or only shift slightly.

How to Explain This to Clients

  • “You’ll notice a sizable drop in impressions in GSC after ~Sep 10, but it doesn’t mean you lost real visibility. The way Google counts impressions just changed.” 
  • “The uptick in average position is largely statistical — because low-ranked, bot-inflated positions are no longer included in that metric.” 
  • “We’ll shift our focus to metrics that reflect real human engagement — clicks, conversions, pages that actually drive business.” 

What You Should Do Now (Next Steps)

  1. Don’t panic over impression drops.
    Instead, cross-reference GSC with Google Analytics 4 (GA4) or your own traffic/log data (a minimal sketch of this check follows this list).
  2. Recalibrate reporting KPIs.
    Focus on clicks, CTR, conversions, and traffic quality rather than relying solely on impression counts.
  3. Prioritize your top keyword positions.
    Keywords that rank on page 1 or 2 are now even more important—lower-ranking terms will often no longer appear in GSC at all. 
  4. Update explanations & predictions to clients.
    Be transparent about this change so they understand that impression drops may not mean performance loss. 
  5. Watch how tools adapt.
    Ranking tools will need to paginate (10 results per request) to recover the data they lost, and some may raise prices to cover the extra queries. Expect delays, gaps, or missing data in the meantime.
  6. Monitor for long-term effects.
    Watch whether this causes consolidation: fewer long-tail impressions, more emphasis on high-intent competitive terms.
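
To make step 1 concrete, here is a minimal sketch of the kind of cross-check you can run on a Google Search Console Performance export. The file name and the Date / Clicks / Impressions column names are assumptions based on a standard dates export; adjust them to match your own data.

import pandas as pd

# Assumed input: a GSC Performance "Dates" export with Date, Clicks, and
# Impressions columns. Adjust the file name and column names to your export.
df = pd.read_csv("gsc_dates_export.csv", parse_dates=["Date"])

cutoff = pd.Timestamp("2025-09-10")  # approximate rollout of the &num=100 change
before = df[df["Date"] < cutoff]
after = df[df["Date"] >= cutoff]

for label, frame in (("before", before), ("after", after)):
    print(f"{label}: avg daily impressions = {frame['Impressions'].mean():.0f}, "
          f"avg daily clicks = {frame['Clicks'].mean():.0f}")

# If impressions fall sharply while clicks hold steady, the drop is a
# reporting artifact of the measurement change, not a real loss of traffic.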

What Does This Mean for Rank Tracker Reports?

Ways to mitigate or resolve the issue:

  1. Paginate via multiple API calls
    Instead of relying on a single &num=100 request, make separate calls for page 1, page 2, etc., and combine the results to recreate deeper SERP data (see the first sketch after this list). Many tools have already adopted this approach internally.

  2. Use SERP APIs built for pagination
    Switch to rank-tracking or SERP APIs that natively support multi-page scraping under their terms, so you don’t hit rate limits or bans.

  3. Prioritize tracking of page 1 / page 2 keywords
    Since results beyond page 2 rarely generate clicks, focus your reporting on keywords currently in those ranges. Instead of chasing page 5 appearances, allocate bandwidth to moving results into page 1–2.

  4. Supplement with click / impression data from GSC or GA4
    Use real user data as your ground truth. Even if rank trackers miss deep rankings, GSC (or GA4) can validate whether you’re receiving traffic from those keywords (see the second sketch after this list).

  5. Re-calibrate reporting metrics
    Update dashboards and client reports to de-emphasize “average position” (which may now be artificially improved) and instead prioritize clicks, CTR, and actual ranking shifts on page 1 / 2.

  6. Monitor how tool providers respond
    Watch for updates from your rank tracker providers. Many will roll out new methods or price models to handle the deeper SERP reconstruction. Stay on a tool whose roadmap is aligned with evolving SERP access.
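
To illustrate item 1, here is a minimal sketch of paginated collection. The endpoint, API key, parameter names, and response shape are placeholders for whatever SERP API you actually use; only the 10-results-per-request paging loop reflects the change itself.

import requests

# Placeholder endpoint, key, and parameter names: substitute your SERP
# provider's real API. The point is the paging loop, not the provider.
API_URL = "https://api.example-serp-provider.com/search"
API_KEY = "YOUR_API_KEY"

def fetch_top_results(keyword, depth=30, page_size=10):
    """Collect the top `depth` organic results by requesting 10 at a time."""
    results = []
    for start in range(0, depth, page_size):
        resp = requests.get(
            API_URL,
            params={"q": keyword, "start": start, "num": page_size, "api_key": API_KEY},
            timeout=30,
        )
        resp.raise_for_status()
        results.extend(resp.json().get("organic_results", []))
    return results

# Example: rankings = fetch_top_results("your target keyword", depth=20)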
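
And for item 4, a sketch of pulling click and impression data straight from the Search Console API as your ground truth. It assumes a service-account key with read access to the property; the key file, property URL, and date range are placeholders.

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumes a service account that has been granted access to the GSC property;
# the key file, property URL, and dates below are placeholders.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="sc-domain:example.com",
    body={
        "startDate": "2025-09-10",
        "endDate": "2025-10-10",
        "dimensions": ["query"],
        "rowLimit": 100,
    },
).execute()

# Queries that still earn clicks are the ones that matter, whatever a rank
# tracker shows for deeper pages.
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])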

Over time, as SERP depth beyond page two becomes less significant and tool providers adjust, the impact will diminish. But in the short term, it demands that SEO teams rethink tracking strategy and rely more on hybrid data (rank + user metrics) to maintain accurate reporting and client expectations.
