
using MAMP v2.0 on Mac -- Apache/2.0.64 (Unix) -- PHP/5.3.5 -- DAV/2 mod_ssl/2.0.64 -- OpenSSL/0.9.7l -- MySQL 5.5.9

I have a script I am trying to run, and it appears to have a major memory leak, which I have attempted to debug but cannot work out how to fix.

Basically, the script is part of a file manager module. It processes the download of a file when given an ID.

The entire file is stored in a database table, as a BLOB, in 64kb chunks (per record), and is streamed down to the client on request.

Database: file_management

Tables: file_details, file_data

file_details:
FileID - int(10) AUTO_INCREMENT
FileTypeID - int(10)
FileType - varchar(60)
FileName - varchar(255)
FileDescription - varchar(255)
FileSize - bigint(20)
FileUploadDate - datetime
FileUploadBy - int(5)

file_data:
FileDataID - int(10) AUTO_INCREMENT
FileID - int(10)
FileData - BLOB
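
For context, this is roughly how a file ends up as 64kb chunks in file_data. This is a simplified, hypothetical sketch, not my exact upload script (which I can provide on request); $filePath and $fileID are illustrative:

    // Sketch only: read the file in 64 KB pieces and insert
    // one file_data row per piece, keyed to the file_details row.
    $chunkSize = 64 * 1024;
    $handle = fopen($filePath, "rb"); // $filePath is illustrative
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        $sql = "INSERT INTO file_data (FileID, FileData) VALUES ("
             . (int)$fileID . ", '" . mysql_real_escape_string($chunk) . "')";
        mysql_query($sql) or die(mysql_error());
    }
    fclose($handle);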

The error I am actually getting is this one (from php error log):

[31-Oct-2011 09:47:39] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 63326173 bytes) in /root/htdocs/file_manager/file_manager_download.php on line 150

Now, the actual download works if the file is small enough (in this case, less than 40mb); however, once it goes over that, like the 60mb file in the error above, it fails. All it produces is a 0kb download.

Obviously, 134217728 bytes is more than 63326173 bytes (128mb vs 60mb).

The "Allowed memory size of 134217728 bytes" comes from the php.ini directive: "memory_limit = 128M ; Maximum amount of memory a script may consume".

If I set this to 256M, it allows me to download that 60mb file, as well as files up to about 80mb.

Likewise, if I set it to 1024M, it allows me to download a 260mb file and possibly bigger.

So you can see that the problem is a leak somewhere in the script that is eating up all the memory.

Here is the download script:



    ini_set('display_errors', 1);
    error_reporting(E_ALL & ~E_NOTICE);

    $strDB = mysql_connect("localhost", "username", "password")
        or die("Error connecting to mysql. Error: (" . mysql_errno() . ") " . mysql_error());
    $database = mysql_select_db("file_management", $strDB);

    if (isset($_GET["id"])) {

        // List of nodes representing each 64kb chunk
        $nodelist = array();

        // Pull file meta-data
        $sql_GetFileDetails = "SELECT FileID, FileTypeID, FileType, FileName,
            FileDescription, FileSize, FileUploadDate, FileUploadBy
            FROM file_details WHERE FileID = '" . $_GET["id"] . "';";

        $result_GetFileDetails = mysql_query($sql_GetFileDetails)
            or die("No results for this FileID. Your Query: " . $sql_GetFileDetails
                . " Error: (" . mysql_errno() . ") " . mysql_error());
        if (mysql_num_rows($result_GetFileDetails) != 1) {
            die("A MySQL error has occurred. Your Query: " . $sql_GetFileDetails
                . " Error: (" . mysql_errno() . ") " . mysql_error());
        }

        // Set the file object to get details from
        $FileDetailsArray = mysql_fetch_assoc($result_GetFileDetails);

        // Pull the list of file inodes
        $sql_GetFileDataNodeIDs = "SELECT FileDataID FROM file_data WHERE FileID = " . $_GET["id"] . " ORDER BY FileDataID";
        if (!$result_GetFileDataNodeIDs = mysql_query($sql_GetFileDataNodeIDs)) {
            die("Failure to retrieve list of file inodes. Your Query: " . $sql_GetFileDataNodeIDs
                . " Error: (" . mysql_errno() . ") " . mysql_error());
        }
        while ($row_GetFileDataNodeIDs = mysql_fetch_assoc($result_GetFileDataNodeIDs)) {
            $nodelist[] = $row_GetFileDataNodeIDs["FileDataID"];
        }

        $FileExtension = explode(".", $FileDetailsArray["FileName"]);
        $FileExtension = strtolower($FileExtension[1]);

        // Determine Content Type
        switch ($FileExtension) {
            case "mp3": $ctype = "audio/mp3"; break;
            case "wav": $ctype = "audio/wav"; break;
            case "pdf": $ctype = "application/pdf"; break;
            //case "exe": $ctype = "application/octet-stream"; break;
            case "zip": $ctype = "application/zip"; break;
            case "doc": $ctype = "application/msword"; break;
            case "xls": $ctype = "application/vnd.ms-excel"; break;
            case "ppt": $ctype = "application/vnd.ms-powerpoint"; break;
            case "gif":  // fall through: force download instead of viewing in browser
            case "png":
            case "jpeg":
            case "jpg":
            default: $ctype = "application/force-download";
        }

        // Send down the headers to the client
        header("Date: " . gmdate("D, j M Y H:i:s e", time()));
        header("Cache-Control: max-age=2592000");
        //header("Last-Modified: " . gmdate("D, j M Y H:i:s e", $info['mtime']));
        //header("Etag: " . sprintf("\"%x-%x-%x\"", $info['ino'], $info['size'], $info['mtime']));
        header("Accept-Ranges: bytes");
        //header("Cache-Control: Expires " . gmdate("D, j M Y H:i:s e", $info['mtime'] + 2592000));
        header("Pragma: public"); // required
        header("Expires: 0");
        header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
        header("Cache-Control: private", false); // required for certain browsers
        header("Content-Description: File Transfer");
        header("Content-Disposition: attachment; filename=\"" . $FileDetailsArray["FileName"] . "\"");
        header("Content-Transfer-Encoding: binary");
        header("Content-Type: " . $ctype);
        ob_end_clean();
        ob_start();
        ob_start("ob_gzhandler");

        // Stream the BLOB chunks down to the client
        $sql_GetFileDataBlobs = "SELECT FileData FROM file_data WHERE FileID = " . $_GET["id"] . " ORDER BY FileDataID ASC;";
        if (!$result_GetFileDataBlobs = mysql_query($sql_GetFileDataBlobs)) {
            die("Failure to retrieve file data blobs. Your Query: " . $sql_GetFileDataBlobs
                . " Error: (" . mysql_errno() . ") " . mysql_error());
        }
        while ($row_GetFileDataBlobs = mysql_fetch_array($result_GetFileDataBlobs)) {
            echo $row_GetFileDataBlobs["FileData"];
        }
        ob_end_flush();
        header('Content-Length: ' . ob_get_length());
        ob_end_flush();
    }

I have used Xdebug and dumped the peak memory usage, but nothing appears to go anywhere near the limit; in total, the reported peak memory usage was something like 900kb for the page.

So I am thinking it is aggregating the file chunks in memory and not letting them go, or something similar; the file chunks are the only thing that could reach that amount of memory and cause the script to fail.

I can provide the script that uploads a file to the database so you can test my script if you like; just let me know.

Cheers for any help!

Mick


* ///////// SOLVED ///////// *

I just want to say thanks to hafichuk for a great response that solved my entire issue.

The problem was twofold.

1 - I wasn't calling ob_flush() inside the while loop. I added that in and it appeared to free up a lot of memory, enabling larger downloads, but not unlimited ones.

For example, with memory_limit = 128M I could now download more than 40mb; in fact, I could now get up to around 200mb, but that is where it failed again. First memory problem solved, though.

LESSON 1: Flush your output buffers!

2 - I was using mysql_query to retrieve the results of my SQL query. The problem is that it buffers the full result set in PHP, and this was adding to my memory limit issue.

I ended up using mysql_unbuffered_query instead, and this now works flawlessly.

This does, however, come with a limitation: it locks your table while the results are being read.

LESSON 2: Don't buffer MySQL results if not required! (within programmatic limitations)
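
Putting the two fixes together, the streaming section of the script now looks roughly like this (a trimmed sketch, not the full script; variable names shortened from the code above):

    // Unbuffered query: rows are fetched from the server one at a time
    // instead of the whole result set being pulled into PHP memory first.
    $result = mysql_unbuffered_query(
        "SELECT FileData FROM file_data WHERE FileID = " . (int)$_GET["id"] .
        " ORDER BY FileDataID ASC");

    while ($row = mysql_fetch_array($result)) {
        echo $row["FileData"];
        ob_flush(); // push this 64kb chunk out of PHP's output buffer...
        flush();    // ...and out of the web server's buffer to the client
    }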

FINAL LESSON:

All of these fixes work; however, some more testing is required to ensure there are no problems when they are combined.

Also, I have learned a lot more about output buffering and PHP memory allocation. I just wish there were a way to visually debug the process a little better than what Xdebug offers; if anyone has ideas on how Xdebug could have actually shed some light on this, please let me know in the comments.

Hope this helps someone else out in the future.

Cheers

Mick

Quantico773
  • Have you tried http://php.net/manual/en/function.ob-flush.php in your while loop? – hafichuk Oct 31 '11 at 01:18
  • hafichuk you are a legend! That works (almost) perfectly. Just to let you know, it appears to work to the full allowance of the memory_limit directive, so if I have it set to 128M, and try a 140mb file, it works, but if I try to download a 250mb file, it gets to about 200mb streamed and fails, with the php error: [31-Oct-2011 11:46:01] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 132392961 bytes) in /root/htdocs/file_manager/file_manager_download.php on line 125, which is the ob_flush(); line. – Quantico773 Oct 31 '11 at 01:49
  • mysql_unbuffered_query saved my day, thanks. – webtweakers May 19 '12 at 01:49
  • To get rid of the table locking issue you could select the CHAR_LEN of the blob in mysql, then successively call SUBSTR on the field in chunks of X bytes. This is like a poor man's buffering inside PHP, but it should free up your table lock for microseconds at a time while the ob_flush() is doing its thing. It can even help you control the download speed via sleep() between DB calls. This is how I stream images from my CMS. – chugadie Mar 18 '13 at 15:39
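  • chugadie's suggestion can be sketched roughly like this (hypothetical code, not from the answer; the 1 MB slice size, $nodeID, and error handling are illustrative — OCTET_LENGTH and SUBSTRING are standard MySQL functions):

        // Read one blob row in slices via SUBSTRING so each query
        // holds the table lock only briefly while streaming.
        $slice = 1024 * 1024; // bytes per query, illustrative
        $len = mysql_result(mysql_query(
            "SELECT OCTET_LENGTH(FileData) FROM file_data WHERE FileDataID = " . (int)$nodeID), 0);
        for ($pos = 1; $pos <= $len; $pos += $slice) { // SUBSTRING is 1-based
            $row = mysql_fetch_row(mysql_query(
                "SELECT SUBSTRING(FileData, $pos, $slice) FROM file_data WHERE FileDataID = " . (int)$nodeID));
            echo $row[0];
            ob_flush();
            flush();
            // optional: usleep() here to throttle the download speed
        }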

1 Answer


You should just need to do an "ob_flush()" in your while loop. This will clear out the buffer to the page. Your last header listing the content length will need to be removed, since you can't send a header after the data has started. This shouldn't be an issue downloading the file, only updating the progress meter for the download.
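
In other words, something like this (a sketch based on the loop in the question):

    while ($row_GetFileDataBlobs = mysql_fetch_array($result_GetFileDataBlobs)) {
        echo $row_GetFileDataBlobs["FileData"];
        ob_flush(); // flush each chunk to the client as it is read
    }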

hafichuk
  • Hi hafichuk, I added this line into the while loop and, as I replied in the comment above, it appears to work but only up to the limit of the memory_limit directive. If I am going to be downloading files that could be, say, 500mb, would I need to increase the memory_limit directive in php.ini? Or is ob_flush able to handle any file size regardless of the directive, and my code is just incorrect somewhere else? Many thanks for your help. – Quantico773 Oct 31 '11 at 02:05
  • Your mysql_fetch_X buffers data so you may need to look at using http://www.php.net/manual/en/function.mysql-unbuffered-query.php. – hafichuk Oct 31 '11 at 02:19
  • You might have better luck if you look at storing the files outside of the database and using http://www.php.net/manual/en/function.fpassthru.php. – hafichuk Oct 31 '11 at 02:22
  • Ok, I put in the mysql_unbuffered_query and it works perfectly. Just to expand on the behaviour of mysql_unbuffered_query: it locks the table for any writes/reads while it's getting the result set, but this is fine for the application I require. Also, in reply to your advice to move the files out of the database: I have many good reasons for having them in the database, and (now) not one reason for using the file system — all to do with security, storage, collating, access privs, etc. Hafichuk, once again thank you so much for your help! – Quantico773 Oct 31 '11 at 02:42