
This summer’s plunge in news traffic was not just about Google’s AI Overviews rewriting how people find information. It was a perfect storm: Google cutting clicks at the front end, and publishers, through Cloudflare’s default bot blocking, cutting off the back-end pathways that help stories “grow legs.” It is the digital equivalent of my dad refusing mobiles for years: fear keeping you isolated just when connection matters most.
Publishers want to blame Google alone. The reality is harsher. Many are hurting themselves by blocking the very crawlers that carry their work into the wider web. Case studies show the cost. One tech site lost half its traffic overnight from a robots.txt error. Recipe sites vanished from search after misconfigured responses. Even e-commerce giants have crashed rankings from accidental bot blocks. Meanwhile, industry data shows publishers losing 1 to 25 percent of Google referrals in just eight weeks as AI Overviews rolled out. Together, that is a double hit: fewer front-end clicks, fewer back-end routes for discovery.
The devastation is real: budgets evaporating, stories buried before they spread. And the truth is painful. Much of this damage is self-inflicted.
The Numbers Do Not Lie
Google’s new AI layer is swallowing clicks. Research shows a 34 percent drop in click-through when an Overview appears. Pew found just 8 percent of users who saw an AI summary clicked any link, compared with 15 percent for traditional search. Only 1 percent clicked the source links inside the summaries. Sessions are ending faster too: 26 percent terminate after an AI Overview, compared with 16 percent without.
Press Gazette reported that MailOnline saw its click-through rates fall by as much as 56 percent when AI Overviews appeared. That single figure should terrify every publisher.
Cloudflare tells the other half of the story. In June 2025, Claude’s crawl-to-referral ratio stood at 70,900 to 1. OpenAI’s was 1,700 to 1. Even Google now crawls 14 times for every referral it sends. Publishers understandably resent this imbalance. But blanket blocking does not just punish AI, it cuts off the propagation that helps stories move.
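Those ratios are easier to feel when flipped into referrals per million crawls. A minimal sketch: the crawl-to-referral ratios are the June 2025 figures cited above; the million-crawl framing is purely for scale.

```python
# Convert crawl-to-referral ratios into "referrals per million crawls".
# Ratios are the June 2025 figures cited in the article; nothing else is assumed.

ratios = {
    "Claude": 70_900,   # 70,900 crawls for every referral sent
    "OpenAI": 1_700,
    "Google": 14,
}

for name, crawls_per_referral in ratios.items():
    referrals = 1_000_000 / crawls_per_referral
    print(f"{name}: {referrals:,.0f} referrals per million crawls")
```

A million Claude crawls buy a publisher roughly 14 visits; the same million from Google still delivers tens of thousands. The imbalance is real, but so is the difference between the bots.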
And so the question becomes: if we cannot win on clicks, can we at least get paid for the crawls themselves?
Will Pay-Per-Crawl Save Us?
Cloudflare’s Pay-Per-Crawl promises to turn scraping into revenue, charging bots a flat fee, sometimes as low as $0.001 per crawl. For B2B publishers, that is not transformative income. Worse, most bots can mimic human traffic and slip through undetected, while some of the crawlers being blocked may actually send useful visitors.
Now compare that to what publishers pay to attract traffic. Google Ads clicks cost 3 to 4 dollars on average, and often more for business and finance keywords. In other words, the industry pays Google thousands of times more to deliver a reader than it would ever earn from refusing one through Pay-Per-Crawl.
That imbalance highlights the core problem. Pay-Per-Crawl might claw back pennies, but it will not restore the economics of journalism. The real value comes from readers, people who click, share, subscribe, and let stories gain influence. Blocking that discovery for fractions of a cent is a false victory.
Journalism’s Real Mission
Stories are not meant to sit still. A great investigation matters because it spreads: picked up, validated, expanded, inspiring other stories and wider impact. That is the press at its best. Yet many publishers are retreating into digital fortresses, cutting off the very mechanisms that give journalism influence.
Of course AI firms should pay fairly, and Cloudflare’s Pay-Per-Crawl may be one step towards that. But shutting the door entirely is self-defeating. My dad once refused to buy a mobile, convinced it was a fad. For a decade he missed out. Now he is addicted to WhatsApp and Facebook. Publishers cannot afford the same mistake.
Two Choices
In hard times there are two instincts: block and shrivel, or grow and thrive. Blocking bots may give momentary satisfaction ("the machine cannot read my words"), but at the cost of readership, influence, and future revenue. Growth demands a more nuanced approach:
• Block bad actors, but keep pathways open for legitimate discovery.
• Build direct audience channels while still engaging with search and social.
• Push for fair compensation, but do not let protection strangle proliferation.
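The first bullet is, in practice, a few lines of robots.txt. A hedged sketch, not a recommendation: the user-agent names below are real crawlers, but which ones count as "bad actors" is an editorial judgment each publisher has to make for itself.

```text
# robots.txt sketch: selective blocking rather than a blanket wall.
# The user-agent names are real; the allow/deny split is illustrative only.

# Keep search discovery open
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /

# Example: refuse a bulk scraper that sends no readers back
User-agent: CCBot
Disallow: /

# Default for everyone else: open, so stories can still spread
User-agent: *
Allow: /
```

Compare this with the single accidental `Disallow: /` under `User-agent: *` of the kind that cost the tech site mentioned earlier half its traffic overnight: the difference between a gate and a wall is one line.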
The bottom line: good journalism needs to proliferate to matter. A story locked away, whether behind a paywall or a robots.txt barricade, cannot change anything. The perfect storm of AI Overviews and bot blocking is real. But it is survivable if the industry remembers its purpose, not just to publish, but to spread.
At Tomorrow’s Publishers we welcome bots. We have published, you have read it, and we do not begrudge anyone taking our news and making the most of it wherever they want. This is a new age of hyper-intelligence, and we want to be at the forefront of it, not throwing the machines into the canal.


© 2025 NewsCaaSLab. All Rights Reserved.