The SEO Ranking Mystery That Had Us Scratching Our Heads (And The Answer Will Surprise You)
Published by Amir Latif
As an SEO consultant, there’s nothing more frustrating than watching a client’s rankings seemingly plummet overnight with no clear explanation. That’s exactly what happened to us in early September, and what followed was weeks of head-scratching, deep-diving into technical audits, and countless sleepless nights trying to figure out what we’d done wrong.
Spoiler alert: We hadn’t done anything wrong. Neither had our client. The culprit? Google itself.
When Client Panic Meets SEO Confusion
It started innocuously enough. One of our long-term clients reached out in mid-September, concerned about dramatic drops in their Search Console impressions. We’re talking about a drop of more than 200,000 daily impressions on desktop. Their average position metrics were getting worse, their rank tracking tools were showing gaps and missing data, and understandably, they were panicking.
My first instinct was to dive into the usual suspects:
- Had we made any recent changes to the site?
- Were there any technical issues we’d missed?
- Had Google rolled out a major algorithm update?
- Were there any manual actions or penalties?
Everything checked out clean. The website was healthy, no recent changes had been made, and other clients weren’t showing similar patterns. That’s when things got really puzzling.
The Desktop vs Mobile Mystery
What made this case particularly baffling was the pattern we were seeing. Desktop impressions were crashing hard, but mobile impressions remained relatively stable. In today’s mobile-first world, seeing desktop-specific issues felt backwards.
We ran through every possible technical explanation:
- Mobile-first indexing problems? Nope.
- Desktop-specific technical issues? Nothing found.
- Different content serving between devices? All looked identical.
- Core Web Vitals discrepancies? Desktop was actually performing better.
The more we investigated, the more confused we became. Our rank tracking tools were showing inconsistent data, some rankings appeared to be missing entirely, and the client’s patience was understandably wearing thin.
The Lightbulb Moment
After weeks of investigation, I stumbled across discussions in SEO communities about similar issues. That’s when I discovered what had actually happened: Google had quietly eliminated a critical search parameter that the entire SEO industry relied on.
On September 10, 2025, Google removed the &num=100 URL parameter that allowed rank tracking tools to display 100 search results per page instead of the standard 10. This seemingly small technical change had massive implications:
- SEO tools broke overnight – Most rank tracking platforms suddenly needed 10 separate requests instead of 1 to gather the same data
- Costs increased 10x – Many tools simply couldn’t afford to continue tracking at the same level
- Data gaps appeared – Some platforms stopped updating entirely or showed incomplete results
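To make the cost explosion concrete, here is a minimal sketch of the arithmetic. The URL format mirrors Google’s public search parameters (q, num, start); the keyword count is an invented example, not real tool data.

```python
# Hypothetical illustration of why removing &num=100 multiplied
# request volume for rank trackers. Figures are invented examples.

def requests_needed(depth: int, results_per_page: int) -> int:
    """Number of SERP fetches needed to cover `depth` ranking positions."""
    return -(-depth // results_per_page)  # ceiling division

def serp_urls(keyword: str, depth: int, results_per_page: int) -> list[str]:
    """Build the paginated search URLs a tracker would have to fetch."""
    base = "https://www.google.com/search"
    return [
        f"{base}?q={keyword}&num={results_per_page}&start={start}"
        for start in range(0, depth, results_per_page)
    ]

keywords_tracked = 50_000  # example portfolio size, purely illustrative

before = keywords_tracked * requests_needed(100, 100)  # one fetch per keyword
after = keywords_tracked * requests_needed(100, 10)    # ten fetches per keyword

print(f"Daily fetches before: {before:,}")     # 50,000
print(f"Daily fetches after:  {after:,}")      # 500,000
print(f"Cost multiplier: {after // before}x")  # 10x
```

Same keywords, same Top 100 depth, ten times the fetches, which is exactly the 10x cost jump that broke so many platforms’ economics.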
But here’s the kicker that really blew my mind…
The Bot Traffic Revelation
The dramatic drop in Search Console impressions wasn’t actually a performance problem – it was the removal of artificial bot impressions that had been inflating metrics for years.
Here’s what was happening: When SEO tools used the &num=100 parameter, they created artificial “search result pages” showing 100 results. This meant websites ranking at position 99 would register impressions in Search Console, even though real users would never scroll that far or see those results.
Our client’s “traffic drop” was actually Google removing fake bot impressions and showing us real user behavior for the first time in years.
The desktop focus made sense too – most rank tracking tools queried Google as desktop browsers, so when that bot traffic disappeared, desktop metrics took the hit while mobile (which had far less bot pollution) remained stable.
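In hindsight, the pattern is easy to spot in a Search Console export: compare average daily impressions by device before and after the September 10 cutover. The sketch below uses invented numbers standing in for an export; the shape of the result, a desktop collapse alongside stable mobile, is what we actually saw.

```python
# Minimal diagnostic sketch: average daily impressions by device,
# before vs. after September 10, 2025. Sample rows are invented;
# in practice they would come from a Search Console performance export.
from datetime import date

CUTOVER = date(2025, 9, 10)

# (date, device, impressions) -- synthetic rows standing in for a CSV export
rows = [
    (date(2025, 9, 8), "desktop", 310_000), (date(2025, 9, 8), "mobile", 120_000),
    (date(2025, 9, 9), "desktop", 305_000), (date(2025, 9, 9), "mobile", 118_000),
    (date(2025, 9, 11), "desktop", 95_000), (date(2025, 9, 11), "mobile", 115_000),
    (date(2025, 9, 12), "desktop", 92_000), (date(2025, 9, 12), "mobile", 117_000),
]

def avg_impressions(device: str, after_cutover: bool) -> float:
    vals = [imp for d, dev, imp in rows
            if dev == device and (d >= CUTOVER) == after_cutover]
    return sum(vals) / len(vals)

for device in ("desktop", "mobile"):
    before = avg_impressions(device, after_cutover=False)
    after = avg_impressions(device, after_cutover=True)
    change = (after - before) / before * 100
    print(f"{device}: {change:+.1f}% impressions after the cutover")
```

A steep desktop drop with a flat mobile line is the fingerprint of removed bot impressions, not a real ranking loss.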
Industry-Wide Chaos
Once I understood what was happening, I started seeing the broader picture. This wasn’t just affecting our client – it was an industry-wide crisis:
- Semrush’s Sensor tool stopped updating on September 10
- Multiple tracking platforms began showing error states and missing data
- Some tools announced they would no longer track Top 100 results due to cost constraints
- seoClarity was one of the few that had prepared for this change and remained unaffected
The SEO community had been unknowingly dependent on inflated metrics for years. What we thought were real performance indicators were partially artificial.
What This Means for SEO Professionals
This revelation has forced me to completely rethink how we measure and report SEO performance:
1. Historical Data is Questionable
Any Search Console impression data before September 10, 2025, may have been artificially inflated. We need to establish new baselines.
2. Focus on Quality Metrics
Click-through rates and actual conversions are more valuable than impression volumes that might include bot traffic.
3. Tool Selection Matters
SEO tools that have successfully adapted to Google’s changes are worth their weight in gold. Those that haven’t may become obsolete.
4. Client Communication is Critical
We need to explain to clients that apparent “drops” starting in mid-September likely represent measurement corrections, not performance problems.
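Points 1 and 2 above can be combined into one simple exercise for client reporting: compute the new impression baseline from post-cutover data only, and lean on CTR, whose numerator (clicks) comes from real users and was never bot-inflated. All figures below are invented for illustration.

```python
# Sketch of the re-baselining described above: use only post-cutover
# rows for the "normal" impression level, and show CTR to demonstrate
# that clicks held steady while impressions corrected downward.
# Daily totals are invented examples.
from datetime import date

CUTOVER = date(2025, 9, 10)

# (date, impressions, clicks) -- synthetic daily site totals
daily = [
    (date(2025, 9, 8), 430_000, 8_600),
    (date(2025, 9, 9), 423_000, 8_500),
    (date(2025, 9, 11), 210_000, 8_550),
    (date(2025, 9, 12), 209_000, 8_400),
]

baseline_rows = [(imp, clk) for d, imp, clk in daily if d >= CUTOVER]
baseline_impressions = sum(imp for imp, _ in baseline_rows) / len(baseline_rows)

for d, imp, clk in daily:
    ctr = clk / imp * 100
    era = "post-cutover (new baseline)" if d >= CUTOVER else "pre-cutover (inflated)"
    print(f"{d}: CTR {ctr:.2f}% [{era}]")

print(f"New impression baseline: {baseline_impressions:,.0f}/day")
```

Clicks barely move while CTR roughly doubles, which makes the client conversation much easier: nothing got worse, the denominator just got honest.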
The Silver Lining
While this situation was initially terrifying, it’s actually revealed something positive: the data we’re seeing now is more accurate than what we’ve had in years.
We’re finally seeing true user behavior without bot pollution. This means our optimization efforts can be based on real user interactions rather than artificially inflated metrics.
What We’re Doing Moving Forward
Based on this experience, we’re making several changes to our SEO practices:
- Diversifying data sources beyond just automated rank tracking
- Focusing more heavily on conversion data and actual business impact
- Manual verification of critical keyword rankings
- Client education about the differences between bot-inflated and real metrics
- Quarterly tool audits to ensure our measurement stack remains reliable
Have You Experienced This Too?
I’m curious about the broader SEO community’s experience with this situation. Have you noticed similar patterns with your clients? Are you seeing:
- Dramatic desktop impression drops starting around September 10?
- Rank tracking tools showing gaps or missing data?
- Clients panicking about apparent performance drops?
- Tools implementing usage restrictions or pricing changes?
I’d love to hear your stories in the comments below.
Did you figure out what was happening before I did? Have you found effective ways to explain this to clients? Are there specific tools or strategies that have helped you navigate this transition?
The Takeaway
This experience taught me a valuable lesson about the SEO industry: we’re often more dependent on third-party infrastructure than we realize. When Google makes changes – even seemingly minor technical ones – it can ripple through our entire measurement ecosystem.
The key is staying curious, questioning anomalies, and not automatically assuming that dramatic data changes mean we’ve done something wrong. Sometimes, the answer lies not in our SEO strategies but in the platforms we use to measure their success.
What’s your biggest SEO mystery that turned out to have a simple explanation? Share it in the comments – I’d love to hear your detective stories!