
I'm being destroyed by robots and malicious users trying to take down my site.

I tried making a bandwidth limiter by mod_rewriting all requests for mp3 files to a PHP script which would track each IP and limit how much each can download per hour. This worked fine, but it created a lot of problems for some users because the links were no longer direct links to the mp3 files.

I tried all sorts of force-download headers, but what worked for some users would not work for others.

Now, the question is: is it possible to keep direct links to the mp3 files and, whenever someone requests one, simultaneously run a tracking PHP script which would allow or deny the request?

thanks!

(server does not have mod_bandwidth or any other useful mod)

  • Host a torrent tracker and torrent them =o) free bandwidth then! Just kidding. To stop the robots, set up a CAPTCHA and have a link with a hash ID provided on success, which will then serve the MP3. As for the malicious users, what are they doing? A CAPTCHA should slow them down as well, but would need more info. – kittycat Dec 17 '12 at 07:53
  • Thanks, but how do you serve the mp3s? The only thing that makes everyone happy is direct links. If I serve them using readfile I get complaints from people who cannot download. Tried all sorts of HTTP headers but no cigar. – user1909257 Dec 17 '12 at 07:57
  • Can you set up a dedicated subdomain for the mp3 files? If so, you can then handle requests with that `Host` header differently from all other requests. Specifically, you can place your existing throttling PHP script as index.php in that subdomain, so the links remain "direct", but throttling is transparently enabled. – Stan Dec 17 '12 at 07:59
  • Thanks, sounds complicated. Do you know a webpage that talks about this? – user1909257 Dec 17 '12 at 08:27
  • @user1909257 Can you please elaborate on the header combinations you have tried? There is no reason that any sensible UA should have a problem with this setup if the headers are set right. – DaveRandom Dec 17 '12 at 10:13

2 Answers


Instead of tracking usage at the moment a user clicks a link, why not make the landing page a PHP script that checks the IP of the user and, based on that, outputs the links or hides them? If your files are stored in an obvious folder, e.g. yoursite/music/artist/song.mp3, you can create soft links for them, which obfuscates the path. On a Linux server you can use the symlink() PHP function to create a soft link to the /music directory and output the path as yoursite/3awE2afeefef4a323/artist/song.mp3, which should still be a direct link to the file, e.g.:

<?php
// Create an obfuscated soft link pointing at the real music directory
$target = 'music/';
$link = 'a322ewrr323211';

// Avoid a warning if the link already exists from an earlier page load
if (!is_link($link)) {
    symlink($target, $link);
}
?>
<html>
<body>
     <a href="a322ewrr323211/artist/song.mp3">link</a>
</body>
</html>

Then periodically delete these symlinks, for example every night or 24 hours after creation (a rough cleanup sketch follows).
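A minimal cleanup sketch, runnable from cron. It assumes the symlinks live in the script's own directory and uses a 24-hour cut-off; both are assumptions, so adjust them to your setup:

<?php
// Remove obfuscated symlinks older than 24 hours (cut-off is an assumption)
$dir    = __DIR__;          // directory where the soft links were created
$maxAge = 24 * 60 * 60;     // 24 hours in seconds

foreach (scandir($dir) as $entry) {
    $path = $dir . '/' . $entry;

    // Only touch symbolic links, never real files or directories
    if (!is_link($path)) {
        continue;
    }

    // lstat() reports on the link itself, not on its target
    $info = lstat($path);
    if ($info !== false && (time() - $info['ctime']) > $maxAge) {
        unlink($path);
    }
}

Run it nightly from cron, or on whatever schedule suits your traffic.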

ierdna

A mod_rewrite solution would be elegant enough and it's possible to keep the links direct.

In your .htaccess, rewrite all .mp3 links to a PHP script:

RewriteEngine on
RewriteRule \.mp3$ download.php

Inside the PHP file, we can extract the requested URI, validate the user's IP and return the appropriate headers based on that validation.

<?php

// Set variables
$requestedFile = trim($_SERVER['REQUEST_URI'], '/');
$ip = $_SERVER['REMOTE_ADDR'];
$protocol = isset($_SERVER['SERVER_PROTOCOL']) ? $_SERVER['SERVER_PROTOCOL'] : 'HTTP/1.0';

// If the file does not exist, send a 404 and stop
if (!file_exists($requestedFile)) {
    header($protocol . ' 404 Not Found');
    exit;
}

// Is the user allowed to download?
function validateDownload($ip) {
    /**
     * Put your IP tracking and validation code here, returning `TRUE`
     * or `FALSE`.
     */

    return TRUE;
}

// Validate and perform appropriate action
$canDownload = validateDownload($ip);

if ($canDownload) {
    header('Content-Disposition: attachment; filename="' . basename($requestedFile) . '"');
    header('Content-Type: audio/mpeg');
    header('Content-Length: ' . filesize($requestedFile)); // the length helps download managers and older browsers
    readfile($requestedFile);
} else {
    header($protocol . ' 403 Forbidden');
}

Now all links remain direct and you're returning the appropriate headers to the user agent, prompting the download or denying access.
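If you need a starting point for validateDownload() itself, here is a minimal sketch of the per-IP hourly limit described in the question, using a flat JSON file as the counter store. The file path, the 100 MB allowance and the extra $fileSize parameter are assumptions, not part of the answer above:

<?php

// Minimal per-IP byte counter that resets every hour.
// Storage path and allowance are assumptions; swap in a database if you prefer.
define('TRACK_FILE', '/tmp/mp3_usage.json');
define('BYTES_PER_HOUR', 100 * 1024 * 1024); // 100 MB per IP per hour

function validateDownload($ip, $fileSize) {
    $hour  = date('YmdH'); // current hour bucket, e.g. 2012121710
    $usage = array();

    if (is_readable(TRACK_FILE)) {
        $usage = json_decode(file_get_contents(TRACK_FILE), true);
        if (!is_array($usage)) {
            $usage = array();
        }
    }

    // Throw away counters from previous hours
    if (!isset($usage['hour']) || $usage['hour'] !== $hour) {
        $usage = array('hour' => $hour, 'ips' => array());
    }

    $used = isset($usage['ips'][$ip]) ? $usage['ips'][$ip] : 0;

    // Deny if this download would push the IP over its hourly allowance
    if ($used + $fileSize > BYTES_PER_HOUR) {
        return FALSE;
    }

    // Record the download and persist the counters
    $usage['ips'][$ip] = $used + $fileSize;
    file_put_contents(TRACK_FILE, json_encode($usage), LOCK_EX);

    return TRUE;
}

You would call it as validateDownload($ip, filesize($requestedFile)) in place of the parameterless stub above.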

amustill
  • Thanks, but the headers and readfile do not work for some people. This is why I wanted to keep direct links. – user1909257 Dec 18 '12 at 19:13
  • @user1909257 If this technique is not working for some users then it's likely that they're not using a browser to download the files, but rather some form of download manager. PHP headers can be a little buggy, especially in older versions of IE (6 and 7), but adding some additional headers (detailed at #3 here: http://www.richnetapps.com/the-right-way-to-handle-file-downloads-in-php/) usually resolves this. All other browsers have no problem understanding these headers when sent correctly, as in my example. – amustill Dec 19 '12 at 10:13
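For reference, a commonly used set of extra download headers along those lines looks roughly like this. It is a sketch, not a guaranteed fix; verify it against the linked article and test with the clients that had trouble:

<?php
// Extra headers often sent for more reliable forced downloads, especially in older IE.
// $requestedFile is the same variable as in the answer above.
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($requestedFile) . '"');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . filesize($requestedFile));
readfile($requestedFile);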