Comment Re:Umm... robots.txt? (Score 1) 85
It's when you get hit with a botnet of over a million unique IPs, rented from some malware provider, crawling and slurping your site down as fast as possible. Your site goes from 4-5 requests per second to thousands, all with randomized user agents, all coming from different residential subnets in different parts of the world. And it goes on for weeks on end. Even when you manage to block it, the traffic doesn't stop. They keep trying, and they keep iterating to find new ways around your block.
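For what it's worth, the one signal in that mess that's cheap to catch is the rate jump itself (4-5 req/s to thousands). A minimal sliding-window sketch, with hypothetical names and assumed window/threshold values, just to show the shape of it:

```python
from collections import deque
import time

class RateSpikeDetector:
    """Flags when the request rate jumps far above a normal baseline.
    Hypothetical sketch: window_secs, baseline_rps, and spike_factor
    are assumed values, not anything from a real deployment."""

    def __init__(self, window_secs=10, baseline_rps=5, spike_factor=20):
        self.window_secs = window_secs
        self.threshold = baseline_rps * spike_factor  # here: 100 req/s
        self.timestamps = deque()

    def record(self, now=None):
        """Record one request; return True if the window rate is a spike."""
        now = time.monotonic() if now is None else now
        self.timestamps.append(now)
        # Evict requests that have fallen out of the window.
        while self.timestamps and self.timestamps[0] < now - self.window_secs:
            self.timestamps.popleft()
        rps = len(self.timestamps) / self.window_secs
        return rps > self.threshold
```

Of course, as the parent says, detection is the easy half; with a million residential IPs and randomized user agents, deciding *what* to block once this fires is the part that drags on for weeks.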