
I'm using a session-controlled script to let users download files from my server that live outside the web-readable area of the site. The problem lately is that the downloads never finish: the browser says it's done, but the files aren't complete. I'm on a shared host (Arvixe) and this is a new problem. They have allegedly extended the PHP timeout limit to an hour, but that hasn't helped.

Here's the code that does the work:

header("Content-length: $filesize");
header("Cache-control: private"); //use this to open files directly
while( !feof ( $filetoget ) ) {
    $buff = fread ( $filetoget, 1024 );
    ob_clean();
    flush();
    echo $buff;
}
fclose ($filetoget);

Based on a similar thread here, I added these two lines, but that doesn't seem to have helped either:

ob_clean();
flush();
  • Does the error log give any hints? Have you tried changing the memory_limit? – ghbarratt Aug 03 '12 at 01:44
  • Try with small files. Is it working? – Hawili Aug 03 '12 at 22:50
  • Small files (25MB) work fine. I upped the chunk size to 8192 and got it to work (see the sketch below). Haven't tried memory_limit, and the error log (at least what I have access to on this shared host) is empty. – Lido Aug 04 '12 at 00:58
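
For anyone hitting the same truncation: below is a minimal sketch of the loop using the 8192-byte chunk the comment above mentions. It also moves flush() after the echo, since the original order cleans the output buffer before the chunk is ever sent, which can discard data when output buffering is active; that reordering is a common fix, not something confirmed in this thread. The $filepath value is hypothetical.

<?php
// Sketch only: hypothetical path outside the web root; the
// session check from the original script is omitted.
$filepath = '/home/user/private/file.zip';
$filesize = filesize($filepath);
$filetoget = fopen($filepath, 'rb');

header("Content-length: $filesize");
header("Cache-control: private");

// End any active output buffers once, up front, so flush()
// pushes each chunk straight to the client.
while (ob_get_level() > 0) {
    ob_end_clean();
}

while (!feof($filetoget)) {
    echo fread($filetoget, 8192); // send the chunk first...
    flush();                      // ...then flush it to the client
}
fclose($filetoget);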

1 Answer


Use set_time_limit(0); it will let the script run forever (or as long as it needs).
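
In context, the call goes at the top of the download script, before any output starts; a minimal sketch (the streaming code is the question's, summarized in a comment):

<?php
// 0 means no execution time limit for this request, so a slow
// download can't be killed by max_execution_time mid-transfer.
set_time_limit(0);

// ... open the file, send the headers, and run the
// fread/echo/flush loop from the question ...

Note that some shared hosts disable set_time_limit() (e.g. via disable_functions), in which case the call may not take effect.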
