
This morning one of our sites was attacked by a bot looking for vulnerabilities. All of the requests came from the same IP address, but the bot only made, on average, 12-16 page requests a minute. It hit a range of different pages, each of which pulls in 20-40 static resources (images, CSS, JS, etc.).

Armed with this knowledge, what is a good strategy for enabling Dynamic IP Restrictions on IIS 7.5? I can see that I can enable "logging only mode", but I'm not entirely sure how best to look at the log files to solve this problem.

What I don't want to do is lock out my legitimate users; I only want to abort the requests coming from bots.
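For context, this is roughly what I think I'd be turning on. It's only a sketch using the WebAdministration PowerShell module: I'm assuming the settings live under system.webServer/security/dynamicIpSecurity (the section used by the built-in IIS 8 feature, which the IIS 7.5 extension may not match exactly), and the site name and threshold values are placeholders.

```powershell
# Sketch only: assumes the Dynamic IP Restrictions settings sit under
# system.webServer/security/dynamicIpSecurity; the site name and the
# threshold values below are placeholders to be tuned.
Import-Module WebAdministration

$site   = 'IIS:\Sites\Default Web Site'   # placeholder site
$filter = 'system.webServer/security/dynamicIpSecurity'

# Start in "logging only" mode so nothing is actually blocked yet.
Set-WebConfigurationProperty -PSPath $site -Filter $filter `
    -Name 'enableLoggingOnlyMode' -Value $true

# Example rate limit: flag clients making more than 50 requests
# in any 10-second window (placeholder numbers).
Set-WebConfigurationProperty -PSPath $site -Filter "$filter/denyByRequestRate" `
    -Name 'enabled' -Value $true
Set-WebConfigurationProperty -PSPath $site -Filter "$filter/denyByRequestRate" `
    -Name 'maxRequests' -Value 50
Set-WebConfigurationProperty -PSPath $site -Filter "$filter/denyByRequestRate" `
    -Name 'requestIntervalInMilliseconds' -Value 10000
```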

John Gardeniers
Rippo
  • Why is it a problem? If you don't have vulnerabilities, it's not a problem. If you do have vulnerabilities, then those vulnerabilities are your problem, not someone scanning for them. – Mike Scott Aug 10 '12 at 10:44
  • Still, doubling up on security would make me feel a whole lot better; unless, that is, adding IP restrictions proves to be troublesome for normal website users. – Rippo Aug 10 '12 at 10:53
  • Plus, the OS or IIS 7.5 might have vulnerabilities that have not yet been patched. – Rippo Aug 10 '12 at 10:54
  • Don't waste your time. Most likely that same machine will have a different IP address tomorrow. – John Gardeniers Aug 10 '12 at 12:27
  • If it's really that much of a concern, you should be looking into load balancing and intrusion prevention devices that are designed to spot these kinds of things and stop them. After all, I send ServerFault 12-16 page requests per minute too! – Brent Pabst Aug 10 '12 at 17:52

1 Answer


We've recently tried to set up the Dynamic IP Restrictions module for one of our larger sites, and it has not been easy. Reading the comments on the original question, I had to smile broadly when I saw "Don't waste your time." It's mostly true.

I'll still give a few hints, though, about what you've got to look out for:

  • You might have to separate your page requests from your static asset requests, i.e. move the latter to their own domain, so that the restrictions you set up apply only to page loads and not to static assets.
  • You'll need to define and test your request-denial criteria carefully - is it something like 10 requests per second, or 50 requests per 10 seconds? 12 requests per 60 seconds is most likely not a good denial criterion because it might affect a lot of legitimate users.
  • Using the "Logging Only Mode" checkbox will log pseudo-denied requests to the IIS log, but with a status code of 200 and a substatus code of 502 (so make sure you're logging the substatus). In addition, the Advanced Logging module will not log those requests correctly, so forget about using it if you want to use Dynamic IP Restrictions and still stay informed about denied requests. There's a rough log-tallying sketch after this list.
  • You have to watch your logs continuously and might eventually have to whitelist some external proxies' IP addresses so that users behind them are treated as separate users rather than as a single user (with the IP of the proxy).
  • You cannot penalize clients for some amount of time after they hit your request limits - as soon as they throttle, their requests pass through again.
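To give an idea of what watching those logs can look like while "Logging Only Mode" is on, here's a rough PowerShell sketch that tallies the pseudo-denied requests (status 200 with substatus 502, as noted above) per client IP from a W3C-format log file. The log path is just an example, and the field names assume the default W3C logging setup with sc-substatus enabled, so adjust as needed:

```powershell
# Rough sketch: count pseudo-denied requests (sc-status 200, sc-substatus 502)
# per client IP in a W3C-format IIS log. The path is an example; sc-substatus
# must be among the logged fields for this to find anything.
$logFile = 'C:\inetpub\logs\LogFiles\W3SVC1\u_ex120810.log'   # example path

$fields = @()
$counts = @{}

foreach ($line in Get-Content $logFile) {
    if ($line.StartsWith('#Fields:')) {
        # Remember the column layout declared in the log header.
        $fields = $line.Substring(8).Trim() -split ' '
        continue
    }
    if ($line.StartsWith('#')) { continue }   # skip other header lines

    $cols      = $line -split ' '
    $status    = $cols[[array]::IndexOf($fields, 'sc-status')]
    $substatus = $cols[[array]::IndexOf($fields, 'sc-substatus')]
    $clientIp  = $cols[[array]::IndexOf($fields, 'c-ip')]

    if ($status -eq '200' -and $substatus -eq '502') {
        $counts[$clientIp] = 1 + [int]$counts[$clientIp]
    }
}

# Show the noisiest clients first.
$counts.GetEnumerator() | Sort-Object Value -Descending | Select-Object -First 20
```

Grouping by IP like this also makes it obvious when a single "user" is really a proxy fronting many people, which feeds into the whitelisting point above.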

We started trying this module out over a month ago and still haven't switched the "Logging Only Mode" off because there's just so much that doesn't feel or work right...

Oliver