The internet is undergoing a massive transformation. Over the past year, global internet traffic grew by 19 percent, driven heavily by a rapid increase in AI bot traffic. As artificial intelligence models become more integrated into daily workflows, automated bots and scrapers are steadily closing the gap with human internet usage.
Recent industry reports show that human web visits fell by approximately 5 percent between the third and fourth quarters of 2025, while AI-related web activity skyrocketed. The digital landscape is also facing an escalation in cyber threats, with record-breaking distributed denial-of-service attacks and a shift in the primary targets of malicious actors. From the dominance of search engine crawlers to emerging challenges for website publishers, the internet is fundamentally rewiring itself.
The Rapid Growth of Automated Crawlers
Automated programs that scan the web for data are taking over a significant portion of network activity. Data shows that traditional non-AI bots and human users are now nearly tied in generating internet requests. As of late 2025, humans made up 47 percent of requests, while traditional bots accounted for 44 percent. Specialized AI bots now account for an average of 4.2 percent of all web requests.
Google’s crawling bot remains the dominant automated actor, reaching 11.6 percent of unique web pages and generating slightly more traffic than all other leading AI bots combined. Its dual-purpose nature—crawling for both traditional search indexing and AI model training—gives it a massive digital footprint.
Other platforms also experienced explosive growth. User-action crawling, which occurs when bots visit websites to answer direct questions from human users, increased more than 15-fold over the course of the year. Highlighting this rapid shift, the ratio of AI bot visits to human visits jumped from one in 200 at the start of the year to one in 31 by the end of 2025.
Website Publishers Face New Challenges
The explosion of AI bot traffic is creating severe headaches for website owners. A major issue is the crawl-to-refer ratio, which measures how often an AI platform scans a site compared to how often it actually sends a user back to that source. Anthropic recorded the highest ratios, scraping sites up to 100,000 times for every one user it referred. OpenAI also showed high ratios, while traditional Google search maintained a much lower, more balanced rate of sending traffic back to publishers.
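To make that metric concrete, here is a minimal Python sketch that computes crawl-to-refer ratios from a site's own log counts. The platform names and numbers are invented for illustration and are not figures from any report.

```python
# Minimal sketch: computing a crawl-to-refer ratio from a site's own logs.
# Platform names and counts below are made up for illustration only.
crawls = {"ai_platform_a": 3_200_000, "ai_platform_b": 450_000, "search_engine": 90_000}
referrals = {"ai_platform_a": 32, "ai_platform_b": 1_500, "search_engine": 45_000}

for platform, crawl_count in crawls.items():
    referred = referrals.get(platform, 0)
    # Pages fetched per visitor sent back; a higher ratio means the platform
    # consumes far more content than it returns in traffic.
    ratio = crawl_count / referred if referred else float("inf")
    print(f"{platform}: {ratio:,.0f} crawls per referred visit")
```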
Publishers are currently struggling with dwindling click-through rates. Websites without direct AI licensing deals saw their click-through rates fall roughly threefold, largely because more than a third of active AI users now begin their searches with an artificial intelligence assistant instead of a traditional search engine.
Administrators attempting to block these bots face an uphill battle. While site owners try to use standard robots.txt instruction files to keep automated scrapers out, analysis shows these rules are ignored around 30 percent of the time. Additionally, blocking Google’s AI crawler is risky, as it forces publishers to choose between protecting their content from AI training and maintaining their visibility in traditional search results.
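As a rough illustration of how those instruction files work, the sketch below uses Python's standard urllib.robotparser module to test a hypothetical robots.txt policy that disallows two AI crawlers while allowing a conventional search crawler. The user-agent tokens are examples, and, as the figures above suggest, compliance with such rules is entirely voluntary.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt policy: block AI crawlers, allow a search crawler.
# The user-agent tokens here are illustrative examples.
policy = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Googlebot
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(policy)

# A well-behaved crawler consults the policy before fetching a page; nothing
# technically enforces it, which is why the rules are so often ignored.
for bot in ("GPTBot", "ClaudeBot", "Googlebot"):
    allowed = parser.can_fetch(bot, "https://example.com/articles/latest")
    print(f"{bot}: {'allowed' if allowed else 'blocked'}")
```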
Shifting Cyber Attacks and Outages
As the volume of traffic grows, so does the scale of cyber warfare. In 2025, security networks blocked more than 25 massive cyber incidents that surpassed all previous traffic peaks.
There is also a notable shift in who is being targeted. For the first time, civil society and non-profit organizations became the most attacked sector on the internet. At its peak, this group faced over 23 percent of all mitigated cyber attacks, likely due to the highly sensitive and financially valuable data they hold on donors and volunteers. Meanwhile, attacks on the gambling and gaming industry dropped by more than half.
The causes of internet disruptions also evolved over the year. Nearly half of all major global internet outages were caused by government interventions, such as deliberate network shutdowns or content restrictions. By contrast, outages caused by severed physical cables decreased by almost 50 percent.
Internet Quality and Encryption Milestones
Despite the challenges of rising bot activity and escalating attacks, the internet achieved important technical milestones. Post-quantum encryption, a technology designed to protect data against future quantum computers, now secures 52 percent of all human web traffic. This marks one of the first large-scale deployments of next-generation cryptography.
In terms of performance, European countries continue to lead the world. Spain ranked first globally for overall internet quality, with high-speed connections reaching 200 to 300 megabits per second. This infrastructure growth highlights that global networks continue to improve even as the fundamental nature of web traffic changes.
