Metihan Eshilyurt launched an AI-generated website with 60,000 pages for $10 and began monitoring bot visits.
The logs show the current state of OpenAI's crawler.
In the first 12 hours, GPTBot made over 29,000 requests.
Googlebot — 11.
For context: that is roughly 470 times more aggressive crawling on a fresh domain with zero backlinks, zero social signals, and no Search Console submission.
GPTBot found the site through the XML sitemap and simply started consuming everything at a rate of about 1 request per second.
The data reveals what early infrastructure looks like before optimization mechanisms are implemented.
Google spent 25 years solving crawl budget allocation: understanding which pages deserve attention, how often to revisit them, and how to do all of this without needlessly DDoSing servers.
For them, this is a long-closed question.
OpenAI does not have this yet.
The crawler behavior Metihan observed is what you get with computational power and ambition, but without the efficiency layer that comes from decades of operating at Google's scale.
This resembles early advertising auction systems, where first-generation bidders simply threw resources at the problem; sophistication came later, once the economics forced it.
GPTBot right now is the equivalent of a bidder with an unlimited budget and no bid cap: effective at reach, terrible at optimization.
The main practical takeaway: if you have not explicitly blocked GPTBot in robots.txt, it is crawling your site all the same.
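A minimal robots.txt rule that opts an entire site out of GPTBot crawling, using OpenAI's documented user-agent token, looks like this (a sketch for illustration, not site-specific advice):

```
User-agent: GPTBot
Disallow: /
```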
Most site owners are not even aware of this, because crawlers do not execute JavaScript and therefore never show up in client-side analytics.
Deploy server-side tracking with user-agent parsing simply to see what is happening.
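As a minimal sketch, assuming combined-format access logs (the nginx/Apache default) and matching on known crawler user-agent tokens; the log path, bot list, and counting logic below are illustrative, not a prescribed setup:

```python
import re
from collections import Counter

# Illustrative path to a combined-format access log (nginx/Apache default).
LOG_PATH = "/var/log/nginx/access.log"

# User-agent tokens for a few crawlers; GPTBot is OpenAI's documented token.
AI_BOTS = ("GPTBot", "ClaudeBot", "PerplexityBot", "Googlebot")

# In the combined log format the user agent is the last quoted field on the line.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def count_bot_hits(path: str) -> Counter:
    """Count requests per known bot by scanning user-agent strings in the log."""
    hits = Counter()
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = UA_RE.search(line)
            if not match:
                continue
            user_agent = match.group(1)
            for bot in AI_BOTS:
                if bot in user_agent:
                    hits[bot] += 1
    return hits

if __name__ == "__main__":
    for bot, count in count_bot_hits(LOG_PATH).most_common():
        print(f"{bot}: {count} requests")
```

Running something like this over a single day's log is usually enough to show whether AI crawlers are already a meaningful share of server traffic.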