
I think this is an easy one.

I have a simple website with a reserved area and some files to download from it.

I disabled direct file downloads with .htaccess, and I manage file downloads via this simple proxy.php script:

// ....

if ($filename === null || !file_exists($proxiedDirectory.$filename)) {
    http_response_code(404);
    exit;
}

if (!$site->is_logged_in()) { 
    http_response_code(403);
    exit;
}

$fp = fopen($proxiedDirectory.$filename, 'rb');

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$filename.'"');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($proxiedDirectory.$filename));
fpassthru($fp);
exit;

This code WORKS perfectly, but unfortunately I have a 1.2 GB file that users may download, and this script is too slow and doesn't allow the full file to be downloaded.

Any help would be appreciated,

Thanks in advance!

M.

Mauro

4 Answers


Use a chunked download, something like this:

$chunkSize = 1024 * 1024; // read 1 MB at a time
while (!feof($fp))
{
    $buffer = fread($fp, $chunkSize);
    echo $buffer;
    ob_flush();
    flush();
}
fclose($fp);
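For a self-contained version of that loop, it could be wrapped up like this minimal sketch (the function name `stream_file` is my own; `set_time_limit(0)` and the flush calls are what keep a multi-gigabyte transfer from dying mid-download):

```php
<?php
// Sketch: stream a file to the client in fixed-size chunks so memory use stays flat.
function stream_file(string $path, int $chunkSize = 1048576): void {
    set_time_limit(0);                  // the transfer may exceed max_execution_time
    $fp = fopen($path, 'rb');
    while (!feof($fp)) {
        echo fread($fp, $chunkSize);    // emit one chunk
        if (ob_get_level() > 0) {
            ob_flush();                 // push PHP's output buffer toward the server
        }
        flush();                        // push the web server's buffer to the client
    }
    fclose($fp);
}
```

Only the read/flush loop changes relative to the question's code; the same `Content-Type` / `Content-Disposition` / `Content-Length` headers would be sent before calling it.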
Pavel Třupek

You can use a combination of header(), set_time_limit(), fgets(), ob_flush(), and flush(). Here is my example; it works best with 64-bit PHP on a 64-bit OS, because filesize() has no size limit on that architecture.

    // Set force download headers
    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $file . '"');
    header('Content-Transfer-Encoding: binary');
    header('Connection: Keep-Alive');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . sprintf("%u", filesize($downloads_folder . $file)));
    // Open and output file contents
    set_time_limit(0);
    $fh = fopen($downloads_folder . $file, "rb");
    while (!feof($fh)) {
      echo fgets($fh);
      ob_flush();
      flush();
    }
    fclose($fh);
    exit;

Hope this helps.

Alessandro

Using nginx as a proxy may be more suitable. The reason is simple: when you download through PHP, the php-fpm request can time out, so the file most likely cannot be downloaded completely.

location /proxy.lua {
    proxy_pass http://proxy.com/link/;
} 

If you need to check whether the user is logged in or not, you can use Lua + nginx (OpenResty).

Yet there is a simple way to check that:

1. proxy.php redirects the request to the nginx location /proxy.lua with two parameters: ts and code.

ts is the server's timestamp
code is md5(ts . "complexstring")

header("Location: /proxy.lua?ts=122&code=xxx&filename=xxxx", true, 302);
exit(0);

Then, in Lua:

1. parse the ts and code parameters and validate them
2. proxy the file
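The signing side of this scheme can be sketched in PHP like so (the helper names, the shared secret value, and the 60-second expiry are my own assumptions; the Lua side would recompute the same md5 and compare):

```php
<?php
// Hypothetical shared secret; must match the value configured on the nginx/Lua side.
const SHARED_SECRET = 'complexstring';

// Build the signed redirect URL for a given file, as described in the answer.
function signed_proxy_url(string $filename): string {
    $ts   = time();                    // server timestamp
    $code = md5($ts . SHARED_SECRET);  // signature over timestamp + secret
    return '/proxy.lua?ts=' . $ts
         . '&code=' . $code
         . '&filename=' . rawurlencode($filename);
}

// What the validating side would do: reject stale or forged links.
function verify(int $ts, string $code, int $maxAgeSeconds = 60): bool {
    return (time() - $ts) <= $maxAgeSeconds
        && hash_equals(md5($ts . SHARED_SECRET), $code);
}
```

Because only the timestamp is signed, any logged-in user's link works for any file; including the filename in the hash would tighten that, at the cost of matching the change on the Lua side.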
ryan
  • Thanks ryan, your answer is correct, but unfortunately I have a shared apache host :( – Mauro Mar 13 '19 at 13:34
  • You can redirect the request to a host running nginx or OpenResty. When proxying a large file, or when the client's network is not fast enough, a timeout will definitely occur. – ryan Apr 03 '19 at 03:18

Use ob_end_clean() before fpassthru($fp), so PHP's output buffer doesn't hold the whole file in memory.

e.g.

ob_end_clean();
fpassthru($fp);

This might work for large files.

Gaurav Gandhi
SachinPatil4991