I am configuring Fail2Ban on my Ubuntu web server to protect it against DoS/DDoS attacks. I don't want to use Cloudflare, because I would have to route my DNS through them and use their SSL cert.
Currently, I'm using a script I found online that checks for more than 1 HTTP `HEAD` request per second, or more than 1 request to `xmlrpc.php` per second. I don't think this is sufficient protection, since these aren't the only kinds of requests an attacker can use to mount a DDoS attack.
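For context, the gist of what that script sets up is something like the following. This is only a sketch of the idea from memory: the jail name, the Apache log path, and the exact regexes are my approximations, not the script's actual contents.

```ini
# /etc/fail2ban/filter.d/head-xmlrpc-flood.conf  (approximate reconstruction)
[Definition]
# Match HEAD requests, and any GET/POST to xmlrpc.php, in the access log
failregex = ^<HOST> .*"HEAD .* HTTP.*"
            ^<HOST> .*"(GET|POST) /xmlrpc\.php.* HTTP.*"
ignoreregex =

# /etc/fail2ban/jail.local  (approximate reconstruction)
[head-xmlrpc-flood]
enabled  = true
port     = http,https
filter   = head-xmlrpc-flood
# Assuming Apache on Ubuntu; adjust for nginx etc.
logpath  = /var/log/apache2/access.log
# "more than 1 per second" = 2 matches within a 1-second window
findtime = 1
maxretry = 2
bantime  = 3600
```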
I'm looking at restricting the number of `GET`/`POST` requests a given IP can make in a short window, but I'm not sure where to set the threshold, since large pages that load a lot of JavaScript, CSS, or images will legitimately generate many `GET` requests in a short amount of time. Should I be limiting `GET`/`POST` requests, or should I be looking at something else? Why?
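If I did go down the request-counting route, I imagine the jail would look roughly like this. The filter would match any `GET`/`POST` line (analogous to the one above), and the thresholds are placeholders I would still have to tune:

```ini
# /etc/fail2ban/jail.local  (hypothetical jail; thresholds are placeholders)
[req-flood]
enabled  = true
port     = http,https
# A filter whose failregex matches any GET/POST line, e.g.:
#   failregex = ^<HOST> .*"(GET|POST) .* HTTP.*"
filter   = req-flood
logpath  = /var/log/apache2/access.log
# Ban an IP that makes more than 100 requests in any 10-second window
findtime = 10
maxretry = 100
bantime  = 600
```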