Open Source Developers Fight Back Against AI Crawlers

- AI web-crawling bots are ignoring robots.txt files and causing outages comparable to DDoS attacks
- Open source developers are fighting back with clever tools like Anubis and Nepenthes
- Anubis blocks bots but lets through human-operated browsers
- Nepenthes traps crawlers in an endless maze of fake content
- Cloudflare has released a tool called AI Labyrinth to slow down and confuse AI crawlers
The Problem of AI Crawlers
AI web-crawling bots are causing serious problems for open source developers, ignoring robots.txt files and hammering sites with so much traffic that the effect resembles a DDoS attack. Developers are fighting back with clever tools and humorous approaches.
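For readers unfamiliar with robots.txt: it is a purely advisory convention, and honoring it is voluntary. The sketch below shows how a well-behaved crawler would consult a site's robots.txt before fetching a page; the URL and user agent are illustrative placeholders, and nothing prevents a misbehaving bot from skipping this check entirely.

```python
# A minimal sketch of how a well-behaved crawler consults robots.txt
# before fetching a page. Compliance is entirely voluntary; the bots
# in this story simply skip this step. The URL and user agent below
# are illustrative placeholders.
from urllib import robotparser

parser = robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetch and parse the site's robots.txt

# A polite crawler asks before fetching; the bots in this story do not.
if parser.can_fetch("ExampleCrawler/1.0", "https://example.com/git/repo"):
    print("Allowed to crawl")
else:
    print("Disallowed; a well-behaved bot stops here")
```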
One developer, Xe Iaso, described how AmazonBot relentlessly pounded a Git server site, ignoring its robots.txt file and causing repeated outages. In response, Iaso built Anubis, a reverse proxy that imposes a proof-of-work check, blocking automated bots while letting human-operated browsers through.
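Anubis's actual implementation runs a JavaScript challenge in the visitor's browser; the sketch below illustrates only the general proof-of-work idea it relies on, with an assumed challenge format and difficulty. The server issues a random challenge, the client must brute-force a nonce whose hash meets a difficulty target, and the server verifies the answer with a single cheap hash. The cost is trivial for one human page view but adds up fast for a crawler requesting thousands of pages.

```python
# A minimal sketch of the proof-of-work idea behind tools like Anubis,
# not its actual implementation (Anubis runs a JavaScript challenge in
# the visitor's browser). The client must find a nonce such that
# SHA-256(challenge + nonce) starts with `difficulty` zero hex digits.
import hashlib
import secrets


def issue_challenge() -> str:
    """Server side: generate a random challenge string."""
    return secrets.token_hex(16)


def solve(challenge: str, difficulty: int = 4) -> int:
    """Client side: brute-force a nonce that satisfies the target."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1


def verify(challenge: str, nonce: int, difficulty: int = 4) -> bool:
    """Server side: one cheap hash confirms the client did the work."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)


challenge = issue_challenge()
nonce = solve(challenge)          # the expensive step, paid by the client
print(verify(challenge, nonce))   # True
```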
Fighting Back with Cleverness
Anubis has spread quickly through the open source community, amassing 2,000 stars and attracting 20 contributors on GitHub. Other developers are taking similar approaches, like Nepenthes, which traps crawlers in an endless maze of fake content.
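Nepenthes is its own project with its own implementation; the sketch below shares only the tarpit concept. Every URL deterministically generates a page of filler text plus links to more generated pages, so a crawler that blindly follows links wanders a maze with no exit, and a short delay per response wastes even more of its time. The word list and delay are illustrative choices.

```python
# An illustration of the tarpit concept behind Nepenthes (a separate
# project; this sketch shares only the idea). Every URL generates a
# stable fake page of filler text plus links to more generated pages.
import hashlib
import random
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

WORDS = ["lorem", "ipsum", "dolor", "sit", "amet", "consectetur"]


class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Seed the RNG with the path so each fake page is stable
        # if the crawler revisits it.
        seed = int(hashlib.sha256(self.path.encode()).hexdigest(), 16)
        rng = random.Random(seed)

        filler = " ".join(rng.choices(WORDS, k=200))
        links = "".join(
            f'<a href="/maze/{rng.getrandbits(32):08x}">more</a> '
            for _ in range(10)
        )
        body = f"<html><body><p>{filler}</p>{links}</body></html>".encode()

        time.sleep(2)  # waste the crawler's time on every request
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("", 8080), TarpitHandler).serve_forever()
```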
Cloudflare has also released a tool called AI Labyrinth, which slows down and confuses AI crawlers that don't respect 'no crawl' directives. Rather than blocking them outright, the tool feeds misbehaving AI crawlers irrelevant content so they waste resources instead of extracting the site's legitimate data.
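Cloudflare's implementation isn't public in this form; purely as an illustration of the detect-and-substitute idea described above, the sketch below routes requests from crawlers a site has asked not to crawl to machine-generated decoy text. The user-agent list and decoy generator are hypothetical placeholders, not Cloudflare's.

```python
# A purely illustrative sketch of the idea described for AI Labyrinth:
# serve machine-generated decoy text to crawlers that ignore 'no crawl'
# directives. The agent names and decoy generator are hypothetical.
import random

DISALLOWED_AGENTS = {"BadBot", "IgnoresRobotsBot"}  # hypothetical names


def generate_decoy(seed: str) -> str:
    """Produce plausible-looking but irrelevant filler text."""
    rng = random.Random(seed)
    words = ["data", "report", "analysis", "summary", "update", "notes"]
    return " ".join(rng.choices(words, k=100))


def handle_request(user_agent: str, path: str, real_page: str) -> str:
    """Route misbehaving crawlers to decoys; everyone else gets the page."""
    if any(agent in user_agent for agent in DISALLOWED_AGENTS):
        return generate_decoy(path)  # waste the crawler's bandwidth
    return real_page


print(handle_request("IgnoresRobotsBot/2.0", "/article/42", "real content"))
```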
A Call to Action
Some developers are calling for a more direct fix, asking people to stop using AI tools and generators. Since that seems unlikely, developers are continuing to fight back with cleverness and humor.