Like many others who have posted here, my Apache server is getting brutally hammered to near death by robots, most of which are good robots. There seems to be no way to change their crawl rate.
Anyone have a suggestion as to how to solve this problem? I thought of creating two groups, "users" and "bots", and allocating 50% of the server's resources to each. Then, based on behaviour or User-Agent, anything identified as a bot would be put into the bots group, whose members together get at most 50% of total resources. If the bots try to take more, the system would slow them down proportionally so they never exceed 50% (see the rough sketch below).
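On CentOS, cgroups seem to do the proportional-share part of this. Here is a rough, untested sketch of what I mean (the group names are my own invention, and I don't yet see how to steer individual Apache requests into different groups, since the same httpd worker serves both users and bots):

    # CentOS: yum install libcgroup
    # Two groups with equal cpu.shares: under contention each side gets
    # roughly 50% of the CPU; when one side is idle the other can use
    # the slack rather than being hard-capped.
    cgcreate -g cpu:/users
    cgcreate -g cpu:/bots
    cgset -r cpu.shares=512 users
    cgset -r cpu.shares=512 bots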
Does anyone know how to go about doing this, or some other method with the same goal? I am using CentOS and have very little experience with this.
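Alternatively, at the Apache level, something like this is the kind of thing I have in mind (untested; it needs Apache 2.4's mod_ratelimit, the bot pattern list and the 50 KiB/s figure are only placeholders, and it caps per-response bandwidth rather than enforcing an overall 50% share):

    # httpd.conf -- needs mod_setenvif and mod_ratelimit loaded
    # Tag requests whose User-Agent looks like a crawler; extend the
    # pattern with the bots that actually show up in your access logs.
    SetEnvIfNoCase User-Agent "(googlebot|bingbot|yandex|baiduspider)" rate-limit=50

    # mod_ratelimit throttles any response whose request carries the
    # "rate-limit" env variable (value in KiB/s); other requests are
    # served at full speed.
    <IfModule mod_ratelimit.c>
        SetOutputFilter RATE_LIMIT
    </IfModule>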