
I've set up PHP to authorize the download of a file, which works:

header("Content-Disposition: attachment; filename=\"" . basename($_GET[l]) . "\"");
header("Content-Type: application/force-download");
header("Content-Length: " . filesize($_GET[l]));
header("Connection: close");
ob_end_flush();
readfile($_GET[l]);

My problem now is that I want to limit the number of concurrent readfile downloads per IP. Is there a way to do this? I am also looking for a way to limit the bandwidth of the PHP download itself; I've tested the built-in Apache module mod_ratelimit, which works when Apache serves the file directly, but not when the file is sent through PHP.

Any ideas?
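
For illustration, one way the per-IP limit could look (this is not from the thread: it assumes the APCu extension is available, uses a counter key per IP, and is completely untested):

<?php
// Illustrative sketch: cap concurrent downloads per IP with an APCu counter.
$ip  = $_SERVER['REMOTE_ADDR'];
$key = 'active_downloads_' . $ip;
$max = 3; // assumed limit of 3 parallel downloads per IP

apcu_add($key, 0);            // create the counter if it does not exist yet
if (apcu_inc($key) > $max) {  // register this download attempt
    apcu_dec($key);
    header('HTTP/1.1 429 Too Many Requests');
    exit('Too many concurrent downloads');
}

// make sure the counter is decremented even if the client aborts
register_shutdown_function(function () use ($key) {
    apcu_dec($key);
});

// ... validate the requested file and send it here ...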

EDIT: This is absolutely insecure! Read the comments below for tips on another method.

karnehe
    `readfile($_GET[l]);` is a huge security risk! – Lawrence Cherone Nov 27 '17 at 07:44
  • could you elaborate? – karnehe Nov 27 '17 at 07:54
  • One could enter `?l=index.php` or worse and download your source code or configs etc. https://www.owasp.org/index.php/Testing_for_Local_File_Inclusion – Lawrence Cherone Nov 27 '17 at 07:58
  • wow, ok... will do a new approach... what's the best way to securely send files? – karnehe Nov 27 '17 at 08:14
  • How I would do it: move the "downloads" out of the web root, then have some code enumerate the directory and store the information in a database. When you want to serve a file, use a dynamic id to reference it (base62[file_id, user_id]), check that the user is allowed to access the file, then serve the file through PHP with fopen. This then lets you rate-limit, because you send the file to the user in chunks with a usleep in between. Have a look at how [FatFree](https://github.com/bcosca/fatfree/blob/e9aec879272c7474294c51c6c00802615afa32a6/lib/web.php#L130) does it. – Lawrence Cherone Nov 27 '17 at 08:21
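
For reference, a minimal sketch of the chunked fopen/usleep serving that the last comment describes (the path, chunk size, and sleep time below are placeholder values, and the file path must come from your own id-to-path lookup, never from user input):

<?php
// Illustrative sketch: stream a file in chunks, pausing between chunks to cap bandwidth.
$path      = '/var/downloads/example.zip'; // placeholder: resolved via a database/whitelist lookup
$chunkSize = 100 * 1024;                   // 100 KB per chunk
$pauseUs   = 1000000;                      // 1 second pause => roughly 100 KB/s

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
header('Content-Length: ' . filesize($path));

$fp = fopen($path, 'rb');
while (!feof($fp)) {
    echo fread($fp, $chunkSize); // send one chunk
    flush();                     // push it out to the client
    usleep($pauseUs);            // throttle before the next chunk
}
fclose($fp);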

0 Answers