This code traps crawlers from OpenAI, Anthropic, and Meta in an endless maze when they don't follow web scraping rules. Think of a website like a store - robots.txt is the sign that says which areas are off limits. For years, AI companies agreed to follow these rules, until they didn't. Anthropic's and Meta's crawlers went overboard, hitting individual websites millions of times a day and consuming significant server resources. So an anonymous developer built Nepenthes, named after a carnivorous pitcher plant. When a rule-breaking crawler enters, it gets lost in an endless maze of fake pages, consuming gibberish data for months. Only OpenAI's crawler managed to escape, and even that took months. The tool is going viral among frustrated developers, helping them fight back against big tech companies trying to steal their data.
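The "endless maze" idea can be sketched in a few lines. This is a hypothetical illustration of how a tarpit like Nepenthes could work, not its actual code: every URL deterministically produces a page of gibberish plus links to more generated URLs, so a crawler that ignores robots.txt never runs out of pages to fetch.

```python
import hashlib

# Hypothetical word pool for the gibberish text (illustrative only).
WORDS = ["lorem", "ipsum", "dolor", "nectar", "pitcher", "trap", "maze", "leaf"]

def fake_page(path: str, n_links: int = 5) -> str:
    """Deterministically generate a gibberish page linking to more fake pages.

    Hashing the path makes the maze stable (revisiting a URL gives the
    same page) while every new link leads deeper into generated content.
    """
    seed = hashlib.sha256(path.encode()).digest()  # 32 bytes of stable "randomness"
    text = " ".join(WORDS[b % len(WORDS)] for b in seed[:16])
    links = "\n".join(
        f'<a href="/maze/{seed[i:i + 4].hex()}">more</a>'
        for i in range(0, n_links * 4, 4)
    )
    return f"<html><body><p>{text}</p>\n{links}\n</body></html>"
```

Served behind paths that robots.txt marks off limits, only crawlers that break the rules ever wander in.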
#shorts #tech #ai #cybersecurity #programming #software #hacking #technology #developer #coding
Category: Artificial Intelligence & Business