
My VPS is getting hammered with requests for random files and directories that don't exist on my server, on the order of roughly two every second. Presumably it's just a bot probing for security holes in scripts on my server, or checking whether my server can be used as a proxy. The log entries are as follows:

Combined (access) log:

- - - [02/Mar/2011:14:10:18 +0000] "GET http://ad.xtendmedia.com/st?ad_type=iframe&ad_size=728x90&section=1697270 HTTP/1.0" 403 204 "http://www.findthemovies.net/" "Mozilla/4.0 (Windows; MSIE 7.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727)"

Error log:

[Wed Mar 02 14:10:18 2011] [error] [client 61.139.105.162] client denied by server configuration: /var/sites/***/public/st, referer: http://www.findthemovies.net/

The requests come from random IP addresses with random referers, so there's no specific set of IPs that I can block. There aren't any scripts installed that they could abuse, and there's no proxying enabled on my server, but my main concern is that these constant requests are using up resources and slowing down genuine requests to my sites.

Is there any way to identify these requests and block them before Apache processes them and uses up the resources assigned to it? (I'm using prefork, so the volume of requests obviously spins up Apache processes and exhausts clients and resources.)

WheresWardy
  • Did you get an answer that worked for you? Just curious because my server has been getting the same type of (no evident IP) `- - -` GET requests lately... – summea Mar 20 '12 at 20:31
  • The short solution was to write a quick script that processed the log files looking for IPs that matched a pattern (I think URLs that didn't have my domain in them) and sent those to iptables to block (a sketch of that approach is below) - the requests would come from about 20 IPs and change every now and then. The long solution was that I eventually switched to nginx/php-fpm and haven't had a similar problem since, as nginx is so fast. – WheresWardy Mar 23 '12 at 09:14
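
A minimal sketch of that log-scraping approach, assuming the error-log format shown in the question; the log path, the choice to block port 80 only, and the lack of de-duplication against existing rules are all assumptions to adjust:

#!/bin/sh
# Hypothetical: pull client IPs out of "client denied" error-log entries
# and block them with iptables. Naive - re-running appends duplicate rules.
grep 'client denied by server configuration' /var/log/apache2/error.log \
  | sed -n 's/.*\[client \([0-9.]*\)\].*/\1/p' \
  | sort -u \
  | while read ip; do
      iptables -A INPUT -s "$ip" -p tcp --dport 80 -j DROP
    done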

3 Answers


You could use a tool like denyhosts or fail2ban – they match patterns in the log files and execute commands based on those matches. You could block with simple TCP wrappers (hosts.deny/hosts.allow), or fire off firewall rules at will.
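
For example, a minimal fail2ban sketch against the "client denied" entries shown in the question – the filter name, log path, and ban settings are assumptions to adapt:

# /etc/fail2ban/filter.d/apache-probes.conf (hypothetical filter name)
[Definition]
# Matches the error-log line from the question; <HOST> captures the IP
failregex = \[client <HOST>\] client denied by server configuration
ignoreregex =

# /etc/fail2ban/jail.local
[apache-probes]
enabled  = true
port     = http,https
filter   = apache-probes
logpath  = /var/log/apache2/error.log
maxretry = 5
# ban for 24 hours, then fail2ban lifts the block automatically
bantime  = 86400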

kojiro
  • Thanks, I'll look into those. My issue with TCP wrappers and firewall rules is that there didn't seem to be anything specific I could pin down with them - my firewall is already pretty tight, short of just blocking access to port 80 completely! –  Mar 02 '11 at 14:21
  • Well, with fail2ban I usually have it deny based on IP for 24 hours, and then unset the deny. In cases where the traffic is accidental, sometimes sending the right response will cause the admins on the other side to notice the problem and fix it on their end. – kojiro Mar 02 '11 at 14:24
  • I've looked into them both, and while they're quite useful, they mostly rely on 1) errors actually being reported in the Apache logs and 2) placing large numbers of IP addresses into hosts.deny or iptables. Obviously 2) doesn't scale very well, but ignoring that, with 1) there isn't always an error message in the Apache error log - for example, a WordPress catch-all .htaccess for permalinks redirects every request to index.php, so it's very difficult to catch the IPs (one possible workaround is sketched below). There don't seem to be many solutions other than increasing server resources to cope. – WheresWardy Mar 03 '11 at 17:43
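
One hedged workaround for that catch-all problem, sketched here as an assumption rather than a tested recipe: reject proxy-style requests above the WordPress rules, so they surface as 403s that fail2ban can match instead of being swallowed by index.php:

RewriteEngine On
# Absolute URLs in the request line mark proxy probes; return 403
# before the WordPress catch-all can hand them to index.php
RewriteCond %{THE_REQUEST} ^[A-Z]+\ http:// [NC]
RewriteRule .* - [F,L]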

Or maybe you could use RewriteRules in your virtual host that filter the offending requests and return 403 Forbidden - or, more amusingly, issue a final redirect to Disneyland, which could annoy the bots (if they follow the redirect, they'll download Disneyland and be occupied for a while):

RewriteEngine On
# Proxy probes put an absolute URL in the request line, which shows up
# in %{THE_REQUEST} as e.g. "GET http://ad.xtendmedia.com/st HTTP/1.0"
RewriteCond %{THE_REQUEST} ^[A-Z]+\ http:// [NC,OR]
# ...or they arrive with a Host header that isn't one of your own
# domains (example.com is a placeholder for your real domain)
RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
RewriteRule (.*) http://disneyland.com$1 [QSA,R=301,L]
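
That block lives in the <VirtualHost> (or in .htaccess, given AllowOverride FileInfo). The two conditions above are illustrative assumptions - match whatever pattern your own logs actually show; %{THE_REQUEST} holds the raw request line, so absolute-URL probes are easy to spot there.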
  • I've done something similar to this for now: I scanned the logs of the domain that was being hit for the most common paths being tried, and as they were (mostly) obscure, I've set them as denied locations across all my domains and am now parsing the logs for accesses to those locations and banning IPs accordingly. Might look into HAProxy. Redirecting to somewhere heavy is certainly a good idea though! – WheresWardy Mar 04 '11 at 10:50

You could install HAProxy on port 80 of your DMZ interface (and have the backend point to Apache running on your LAN or localhost interface), and create a few ACL rules that tarpit the connections when they match (or don't match) certain path/header/domain criteria.

The benefit of a tarpit is that it will slow down bots, and prevent (most of) them from making consecutive calls.
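
As a rough sketch (assuming a reasonably recent HAProxy; example.com, the bind address, and the backend port are placeholders):

frontend http-in
    bind *:80
    # an absolute URL in the request line marks a proxy probe
    acl abs_url url_beg http://
    # Host headers that aren't one of our own domains
    acl our_host hdr(host) -i example.com www.example.com
    # hold matching connections open, then answer 500 after the timeout
    timeout tarpit 1m
    http-request tarpit if abs_url or !our_host
    default_backend apache

backend apache
    # Apache moved off port 80 onto localhost
    server local 127.0.0.1:8080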

Matt Beckman
  • This seems like a good idea, though currently I've got nginx serving port 80 (for static files) and reverse proxying through to Apache, so it could get quite complicated! Is there anything that would make a bot proxy request like the ones I'm getting stand out? Apart from the path, IP address and domain (which can all change or be unique), they all seem to be looking for the same thing - passing a complete URL as the path (e.g. http://seattle.craigslist.org/see/boa/2245781192.html) to /st with a 301 status code, presumably hoping for a proxy redirect? – WheresWardy Mar 04 '11 at 10:56
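
One way to make those stand out at the nginx layer - a minimal sketch, assuming nginx fronts all traffic - is a catch-all server block that closes the connection for any Host you don't explicitly serve; absolute-URL probes carry a foreign host, which nginx takes from the request line, so they never reach Apache:

# nginx: default catch-all; unknown Hosts get the connection closed
# without a response (444 is nginx-specific)
server {
    listen 80 default_server;
    server_name _;
    return 444;
}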