
I'm a novice, so I'll do my best to explain the problem I'm having. I apologize in advance if I've left something out or anything is unclear.

I'm serving an 81MB zip file from outside my root directory to people who are validated beforehand. I've been getting reports of corrupted downloads or an inability to complete the download, and I've verified this happens on my machine if I simulate a slow connection.

I'm on shared hosting running Apache-Coyote/1.1.

I get a network timeout error. I think my host might be killing the downloads if they take too long, but they haven't confirmed either way.

I thought I was maybe running into a memory limit or time limit, so my host installed the Apache module XSendFile (mod_xsendfile). In the file that handles the download after validation, the headers are set this way:

<?php
set_time_limit(0); // no PHP time limit while this script runs
$file = '/absolute/path/to/myzip/myzip.zip';

// Hand delivery of the file over to Apache via mod_xsendfile,
// and present it to the browser as a zip attachment
header("X-Sendfile: $file");
header("Content-Type: application/zip");
header('Content-Disposition: attachment; filename="' . basename($file) . '"');

Any help or suggestions would be appreciated. Thanks!

stringerbell
  • Have you tried sending the file using `Content-Type: application/octet-stream` instead of "zip"? I would also suggest reviewing https://tn123.org/mod_xsendfile/ and making sure / verifying with your host that the settings are correct. – Jim Sep 13 '12 at 17:59
  • It turns out that my shared hosting environment was the problem. My host wasn't able to properly install XSendFile and was killing it after 10 minutes, thus people with slow connections were getting partial downloads/network timeouts. Moving to a VPS solved this problem. – stringerbell Sep 17 '12 at 21:22

1 Answer


I would suggest taking a look at this comment, particularly if you are using Apache:

http://www.php.net/manual/en/function.readfile.php#99406

If you are not using Apache, the code in the link above should still be helpful. The comment reads:

I started running into trouble when I had really large files being sent to clients with really slow download speeds. In those cases, the script would time out and the download would terminate with an incomplete file. I am dead set against disabling script timeouts - any time that is the solution to a programming problem, you are doing something wrong - so I attempted to scale the timeout based on the size of the file. That ultimately failed, though, because it was impossible to predict the speed at which the end user would be downloading the file; it was really just a best guess, so inevitably we still got reports of script timeouts.

Then I stumbled across a fantastic Apache module called mod_xsendfile ( https://tn123.org/mod_xsendfile/ (binaries) or https://github.com/nmaier/mod_xsendfile (source)). This module basically monitors the output buffer for the presence of special headers, and when it finds them it triggers Apache to send the file on its own, almost as if the user had requested the file directly. PHP processing is halted at that point, so there are no timeout errors regardless of the size of the file or the download speed of the client. And the end client gets the full benefits of Apache sending the file, such as an accurate file size report and a download progress bar.
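As an aside, mod_xsendfile also has to be enabled and allowed to reach files outside the document root on the Apache side. A minimal sketch of that server-side configuration, assuming a virtual host you control and using the path from the question as a placeholder:

# Hypothetical virtual-host fragment for mod_xsendfile (adjust paths to your own setup)
XSendFile On
# Allow Apache to serve files from this directory, which sits outside the document root
XSendFilePath /absolute/path/to/myzip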

The code I finally ended up with is too long to post here, but in general it uses the mod_xsendfile module if it is present; if not, the script falls back to the code I originally posted. You can find some example code at https://gist.github.com/854168
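A minimal sketch of that fallback logic might look like this - it assumes PHP is running as an Apache module (so apache_get_modules() is available), and the file path is just a placeholder:

<?php
// Sketch only: use mod_xsendfile when it is loaded, otherwise stream the file manually.
$file = '/absolute/path/to/myzip/myzip.zip'; // placeholder path

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');

$hasXSendfile = function_exists('apache_get_modules')
    && in_array('mod_xsendfile', apache_get_modules());

if ($hasXSendfile) {
    // Apache takes over delivery; PHP stops doing the heavy lifting here.
    header("X-Sendfile: $file");
} else {
    // Fallback: send the file in chunks so memory use stays bounded.
    header('Content-Length: ' . filesize($file));
    $handle = fopen($file, 'rb');
    while (!feof($handle)) {
        echo fread($handle, 1024 * 1024); // 1 MB at a time
        flush();                          // push each chunk to the client
    }
    fclose($handle);
}
exit;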

EDIT

Just to have a reference for the code that does the "chunking" (Link to Original Code):

<?php
// Reads a file in chunks of $chunksize bytes.
// $type = 'array'  -> returns an array of lines, like file()
// $type = 'string' -> returns the whole contents, like file_get_contents()
function readfile_chunked($filename, $type = 'array') {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    $lines = ($type === 'array') ? array() : '';
    while (!feof($handle)) {
        switch ($type) {
            case 'array':
                // Collect lines, like file()
                $lines[] = fgets($handle, $chunksize);
                break;
            case 'string':
                // Append each chunk, like file_get_contents()
                $lines .= fread($handle, $chunksize);
                break;
        }
    }
    fclose($handle);
    return $lines;
}
?>
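For completeness, an illustrative call to the function above - the path is a placeholder, and keep in mind that both modes load the whole file into PHP memory, so for very large downloads you would normally echo and flush each chunk instead, as in the fallback sketch earlier:

<?php
// Illustrative only: read a (reasonably small) file with readfile_chunked()
$path = '/absolute/path/to/some/file.txt'; // placeholder path

$lines    = readfile_chunked($path, 'array');  // array of lines, like file()
$contents = readfile_chunked($path, 'string'); // whole contents, like file_get_contents()

if ($lines === false || $contents === false) {
    die('Could not open file');
}
printf("%d lines, %d bytes\n", count($lines), strlen($contents));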
Jim