[AI Summary]: AI scrapers are causing severe infrastructure problems for FOSS projects through aggressive crawling that ignores robots.txt, spoofs user agents, and operates from tens of thousands of IPs to evade detection. Projects including SourceHut, KDE GitLab, GNOME, LWN, and Fedora report that 70-97% of their traffic comes from AI bots. The load causes regular outages and has forced drastic countermeasures: blocking entire countries (Fedora blocked Brazil), deploying proof-of-work challenges that delay legitimate users, and absorbing thousands of dollars in monthly bandwidth costs. On top of this, AI-generated bug reports with hallucinated security vulnerabilities are wasting maintainer time, creating a double burden on already resource-constrained open source communities, which must keep their infrastructure public for collaboration while commercial projects can operate behind closed doors.
- Author: Niccolò Venerandi
- Published: March 20, 2025
- Source: LibreNews