Open source developers are fighting back against AI web crawlers with creative and often humorous tactics. Many AI bots ignore robots.txt, causing severe strain on servers, especially for free and open source software projects that lack the resources to counteract them. Some developers have reported relentless bot activity leading to downtime and service disruptions.
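For context, robots.txt is a plain-text file at a site's root that tells crawlers which paths they may fetch. A typical directive aimed at AI scrapers (the bot names here are illustrative; actual user-agent strings vary by company) might look like this, and the complaint is precisely that many AI crawlers disregard it:

```
# /robots.txt — advisory only; compliance is voluntary
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Allow: /
```

Because the file carries no enforcement mechanism, projects whose servers are being hammered have had to turn to the active countermeasures described below.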
One notable response comes from Xe Iaso, who developed Anubis, a proof-of-work system that filters out AI crawlers while allowing human users through. Named after the Egyptian god of judgment, Anubis blocks bots that fail the challenge, while real users are welcomed with an anime-inspired image. The project gained rapid popularity, earning thousands of GitHub stars within days.
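The core idea behind a proof-of-work gate is asymmetry: a browser spends a moment of CPU time solving a hash puzzle once, while a scraper hitting thousands of pages pays that cost over and over. The following is a minimal sketch of that idea in Python, not Anubis's actual implementation; the function names, the SHA-256 choice, and the difficulty parameter are illustrative assumptions:

```python
import hashlib

def solve_challenge(challenge: str, difficulty: int = 4) -> int:
    """Client side: find a nonce so that SHA-256(challenge + nonce)
    starts with `difficulty` hex zeros. Cheap once, costly at crawler scale.
    (Illustrative sketch -- not Anubis's real protocol.)"""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty: int = 4) -> bool:
    """Server side: a single hash verifies the work, no matter how
    long the client spent finding the nonce."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

The server only ever computes one hash per request, so verification stays cheap even under load; raising `difficulty` makes each page fetch exponentially more expensive for a bot farm.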
Other developers have taken even more aggressive measures. Some open source maintainers, overwhelmed by AI scrapers, have resorted to blocking entire countries to keep their projects operational. Others have built trap systems like Nepenthes, which misleads bots by feeding them endless loops of fake content, effectively wasting their resources.
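A tarpit like Nepenthes works by serving deterministic fake pages: every URL in the maze resolves to plausible-looking text plus links deeper into the maze, so a crawler never hits a dead end. Here is a hypothetical sketch of that technique in Python; the seeding scheme, word list, and `/maze/` path are all invented for illustration and are not Nepenthes's actual code:

```python
import hashlib
import random

def tarpit_page(path: str, n_links: int = 5) -> str:
    """Generate a deterministic fake page for any requested path.
    Seeding the RNG from the path means the same URL always returns
    the same content and the same onward links, so the maze looks
    like a real, stable site to a crawler. (Hypothetical sketch.)"""
    seed = int.from_bytes(hashlib.sha256(path.encode()).digest()[:8], "big")
    rng = random.Random(seed)
    words = ["archive", "dataset", "index", "record", "node", "entry", "manifest"]
    text = " ".join(rng.choice(words) for _ in range(40))
    links = "".join(
        f'<a href="/maze/{rng.getrandbits(32):08x}">more</a>\n'
        for _ in range(n_links)
    )
    return f"<html><body><p>{text}</p>\n{links}</body></html>"
```

Each fake page costs the server almost nothing to generate, while the bot burns bandwidth and compute following links that never lead anywhere real.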
Cloudflare, a major player in online security, has introduced its own solution, AI Labyrinth, which confuses and slows down bots that ignore crawling restrictions. These strategies reflect a growing frustration among developers, many of whom see AI scrapers as unethical data harvesters.
Despite these defenses, some in the open source community argue for a broader cultural shift, urging developers to stop supporting AI models that rely on aggressive web scraping. However, with AI development showing no signs of slowing down, open source contributors are left to fight back with ingenuity and a sense of humor.