Companies racing to block artificial intelligence bots from scraping their websites may be undermining their own ability to control how customers see them online.
A new analysis by Hostinger, a no-code A.I. agent platform for building and growing online businesses, finds that A.I. assistant crawlers are rapidly expanding their reach across the web even as businesses aggressively restrict bots used to train A.I. models.
The study is based on 66.7 billion verified bot interactions across 5 million websites. Its conclusion is stark: A.I. assistants are reading and summarizing more business websites at the very moment companies are limiting their ability to shape how those systems interpret their content.
For decades, online commerce operated on a stable exchange. Search engines indexed websites and directed users back to them. Brands controlled pricing context, messaging and attribution within their own digital properties.
That model is now shifting. “With AI assistants increasingly answering questions directly, the web is shifting from a click-driven model to an agent-mediated one,” said Tomas Rasymas, Head of AI at Hostinger. “The real risk for businesses isn’t AI access itself, but losing control over how pricing, positioning, and value are presented when decisions are made.”
A.I. assistants increasingly answer questions directly, replacing clicks with summaries and recommendations. Discovery does not necessarily lead to a website visit. In many cases, the decision is shaped before the user ever lands on a brand’s page.
Hostinger’s data suggests that this transition is accelerating.
Over a five-month period, OpenAI’s OAI-SearchBot expanded its coverage from 52 percent to 68 percent of websites, while Applebot doubled its presence from 17 percent to 34 percent. Traditional search crawlers remained broadly stable, indicating that A.I. systems are adding a new layer of decision-making rather than replacing search outright.
Blocking A.I., but not the right systems
At the same time, companies are sharply reducing access to model training crawlers.
According to the report, OpenAI’s GPTBot fell from 84 percent website coverage in August to just 12 percent by November. Meta’s training crawler, Meta-ExternalAgent, dropped from 60 percent to 41 percent over the same period.
Hostinger argues that many businesses are conflating two distinct types of bots.
Training crawlers collect data to improve A.I. models over time. Assistant crawlers, by contrast, retrieve content in real time to answer user questions. Blocking one does not necessarily prevent the other from summarizing products, ranking services or recommending brands.
In effect, companies may be limiting how A.I. systems learn from them while leaving intact the tools that actively mediate customer decisions.
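At the crawler level, that distinction plays out in a site’s robots.txt file. As a rough sketch, and assuming the publicly documented user-agent names for these crawlers, a site that wanted to stay visible to assistants while opting out of model training might publish rules along these lines:

```
# Allow real-time assistant crawlers to read the site
User-agent: OAI-SearchBot
Allow: /

User-agent: ChatGPT-User
Allow: /

# Opt out of model-training crawlers
User-agent: GPTBot
Disallow: /

User-agent: Applebot-Extended
Disallow: /

User-agent: Meta-ExternalAgent
Disallow: /
```

robots.txt is a request rather than an enforcement mechanism, so crawlers that ignore it still have to be blocked at the network or CDN level.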
Pricing, brand safety and attribution at risk
The implications stretch beyond traffic numbers.
As assistants summarize offers, the nuance of pricing structures can disappear. Messaging may be reframed outside approved brand guidelines. Paid advertising can lose visibility earlier in the customer journey. Ecommerce attribution becomes harder when transactions are influenced or completed within A.I. interfaces.
Hostinger’s report suggests that marketing teams may feel the impact first, but revenue forecasting, compliance oversight and operational planning could follow.
The shift reflects a broader structural change. Instead of competing primarily for search rankings and clicks, companies are increasingly competing for how A.I. systems interpret and present their information.
From blocking to governance
Some businesses are beginning to adjust their strategies.
Rather than imposing blanket bans, they are moving toward what the report describes as selective A.I. governance. That approach allows assistant crawlers to access structured, authoritative content while restricting bots that pose intellectual property or cost risks.
Tools such as llms.txt, a machine-readable file designed to guide A.I. assistants toward preferred pages and priorities, are gaining attention. Companies are also experimenting with A.I.-ready interfaces that expose up-to-date structured content instead of leaving assistants to infer details from scattered web pages.
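The format is an informal proposal rather than a settled standard, but in its draft form it is simply a Markdown file served at the site root, with a short summary and links to the pages an assistant should treat as canonical. A minimal sketch, with hypothetical page names and URLs:

```markdown
# Example Store

> Direct-to-consumer retailer of refurbished laptops. The pricing and
> warranty pages linked below are canonical; ignore cached third-party listings.

## Products
- [Current catalog and pricing](https://example.com/products.md): updated daily
- [Warranty and returns policy](https://example.com/warranty.md)

## Company
- [About and press contacts](https://example.com/about.md)

## Optional
- [Archived press releases](https://example.com/press/archive.md)
```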
The underlying tension is clear. Companies want to protect proprietary content and manage costs tied to bot traffic. But in doing so, they may be surrendering influence over how A.I. systems frame their products and services.
Hostinger’s analysis, drawn from anonymized log entries collected during three six-day windows in June, August and November 2025, classified verified crawler traffic using publicly documented user agents and observed behavior patterns. Human traffic was excluded.
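The report does not publish its classification pipeline, but the general approach, matching verified requests against documented crawler user agents, can be sketched in a few lines of Python. The agent names below are publicly documented; the log format and the lack of any IP-level verification are simplifying assumptions:

```python
import re
from collections import Counter

# Training crawlers are checked first so "Applebot-Extended" is not
# misread as the plain "Applebot" assistant crawler.
TRAINING_BOTS = ("GPTBot", "Meta-ExternalAgent", "Applebot-Extended", "CCBot")
ASSISTANT_BOTS = ("OAI-SearchBot", "ChatGPT-User", "PerplexityBot", "Applebot")

# Assumes an Apache/Nginx "combined" log format, where the user agent
# is the final quoted field on each line.
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')


def classify(user_agent: str) -> str:
    """Label a request as training, assistant, or other traffic."""
    ua = user_agent.lower()
    if any(bot.lower() in ua for bot in TRAINING_BOTS):
        return "training"
    if any(bot.lower() in ua for bot in ASSISTANT_BOTS):
        return "assistant"
    return "other"


def summarize(log_path: str) -> Counter:
    """Count requests per crawler category in a single access log."""
    counts: Counter = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            match = UA_PATTERN.search(line)
            if match:
                counts[classify(match.group(1))] += 1
    return counts


if __name__ == "__main__":
    print(summarize("access.log"))
```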
The findings suggest that the commercial internet is entering a new phase. Search once determined visibility. Now assistants increasingly shape interpretation.
For brands accustomed to controlling their own narrative, the challenge is no longer just being found. It is being understood correctly by systems that increasingly stand between companies and their customers.