Using MAMP v2.0 on Mac -- Apache/2.0.64 (Unix) -- PHP/5.3.5 -- DAV/2 mod_ssl/2.0.64 -- OpenSSL/0.9.7l -- MySQL 5.5.9
I have a script I am trying to run and it appears to have a major memory leak, which I have attempted to debug but cannot work out how to fix.
Basically, the script is part of a file manager module. It processes the download of a file when given an ID.
The entire file is stored in a database table, as a BLOB, in 64kb chunks (per record), and is streamed down to the client on request.
Database: file_management
Tables: file_details, file_data
file_details:
FileID - int(10) AUTO_INCREMENT
FileTypeID - int(10)
FileType - varchar(60)
FileName - varchar(255)
FileDescription - varchar(255)
FileSize - bigint(20)
FileUploadDate - datetime
FileUploadBy - int(5)
file_data:
FileDataID - int(10) AUTO_INCREMENT
FileID - int(10)
FileData - BLOB
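For reference, the upload side splits the file into 64kb pieces before inserting them. The actual upload script isn't shown here, but a minimal sketch of that idea (assuming the same old mysql_* API and the table layout above; file path and omitted columns are just placeholders) would look something like this:
<?php
// Hypothetical upload sketch: split a local file into 64 KB chunks and
// insert one file_data row per chunk. Table/column names follow the schema
// above; other file_details columns are omitted for brevity.
$link = mysql_connect("localhost", "username", "password");
mysql_select_db("file_management", $link);
$path   = "/tmp/example.pdf";   // assumed source file
$handle = fopen($path, "rb");
mysql_query("INSERT INTO file_details (FileName, FileSize, FileUploadDate)
             VALUES ('" . mysql_real_escape_string(basename($path)) . "', "
             . filesize($path) . ", NOW())");
$fileId = mysql_insert_id();
while (!feof($handle)) {
    $chunk = fread($handle, 65536);   // 64 KB per record
    mysql_query("INSERT INTO file_data (FileID, FileData)
                 VALUES ($fileId, '" . mysql_real_escape_string($chunk) . "')");
}
fclose($handle);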
The error I am actually getting is this one (from php error log):
[31-Oct-2011 09:47:39] PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 63326173 bytes) in /root/htdocs/file_manager/file_manager_download.php on line 150
Now, the download itself works if the file is small enough, in this case less than 40mb. However, once it goes over that, like the 60mb file in the error above, it fails: all I get is a 0kb file.
Obviously, 134217728 bytes is more than 63326173 bytes (128mb vs 60mb).
The "Allowed memory size of 134217728 bytes" comes from this directive in php.ini: "memory_limit = 128M ; Maximum amount of memory a script may consume"
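You can confirm the limit the script is actually running under, and raise it for this one script without touching php.ini, with a quick check near the top of the download script (just illustrative, not a fix for the underlying leak):
// Check the effective limit and, if needed, raise it for this request only.
echo ini_get('memory_limit');      // e.g. "128M"
ini_set('memory_limit', '256M');   // per-script override, same effect as editing php.ini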
If I set this to 256M, it allows me to download that 60mb file, as well as up to about an 80mb file.
Also, if I set this to 1024M it allows me to download a 260mb file and possibly bigger.
So you can see that the problem is a leak somewhere in the script that is eating up all the memory.
Here is the download script:
ini_set('display_errors',1);
error_reporting(E_ALL & ~E_NOTICE);
$strDB=mysql_connect("localhost","username","password")or die ("Error connecting to mysql.. Error: (" . mysql_errno() . ") " . mysql_error());
$database=mysql_select_db("file_management",$strDB);
if (isset($_GET["id"])) {
// List of nodes representing each 64kb chunk
$nodelist = array();
// Pull file meta-data
$sql_GetFileDetails = "
SELECT
FileID,
FileTypeID,
FileType,
FileName,
FileDescription,
FileSize,
FileUploadDate,
FileUploadBy
FROM file_details WHERE FileID = '".$_GET["id"]."';";
$result_GetFileDetails = mysql_query($sql_GetFileDetails) or die ("No results for this FileID.
Your Query: " . $sql_GetFileDetails . "
Error: (" . mysql_errno() . ") " . mysql_error());
if (mysql_num_rows($result_GetFileDetails) != 1) { die ("A MySQL error has occurred.
Your Query: " . $sql_GetFileDetails . "
Error: (" . mysql_errno() . ") " . mysql_error()); }
// Set the file object to get details from
$FileDetailsArray = mysql_fetch_assoc($result_GetFileDetails);
// Pull the list of file inodes
$sql_GetFileDataNodeIDs = "SELECT FileDataID FROM file_data WHERE FileID = ".$_GET["id"]." order by FileDataID";
if (!$result_GetFileDataNodeIDs = mysql_query($sql_GetFileDataNodeIDs)) { die("Failure to retrieve list of file inodes
Your Query: " . $sql_GetFileDataNodeIDs . "
Error: (" . mysql_errno() . ") " . mysql_error()); }
while ($row_GetFileDataNodeIDs = mysql_fetch_assoc($result_GetFileDataNodeIDs)) {
$nodelist[] = $row_GetFileDataNodeIDs["FileDataID"];
}
$FileExtension = explode(".",$FileDetailsArray["FileName"]);
$FileExtension = strtolower(end($FileExtension)); // use the last dot-separated segment as the extension
// Determine Content Type
switch ($FileExtension) {
case "mp3": $ctype="audio/mp3"; break;
case "wav": $ctype="audio/wav"; break;
case "pdf": $ctype="application/pdf"; break;
//case "exe": $ctype="application/octet-stream"; break;
case "zip": $ctype="application/zip"; break;
case "doc": $ctype="application/msword"; break;
case "xls": $ctype="application/vnd.ms-excel"; break;
case "ppt": $ctype="application/vnd.ms-powerpoint"; break;
case "gif": $ctype="application/force-download"; break; // This forces download, instead of viewing in browser.
case "png": $ctype="application/force-download"; break; // This forces download, instead of viewing in browser.
case "jpeg": $ctype="application/force-download"; break; // This forces download, instead of viewing in browser.
case "jpg": $ctype="application/force-download"; break; // This forces download, instead of viewing in browser.
default: $ctype="application/force-download"; // This forces download, instead of viewing in browser.
}
// Send down the header to the client
header("Date: ".gmdate("D, j M Y H:i:s e", time()));
header("Cache-Control: max-age=2592000");
//header("Last-Modified: ".gmdate("D, j M Y H:i:s e", $info['mtime']));
//header("Etag: ".sprintf("\"%x-%x-%x\"", $info['ino'], $info['size'], $info['mtime']));
header("Accept-Ranges: bytes");
//header("Cache-Control: Expires ".gmdate("D, j M Y H:i:s e", $info['mtime']+2592000));
header("Pragma: public"); // required
header("Expires: 0");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header("Cache-Control: private",false); // required for certain browsers
header("Content-Description: File Transfer");
header("Content-Disposition: attachment; filename=\"".$FileDetailsArray["FileName"]."\"");
header("Content-Transfer-Encoding: binary");
header("Content-Type: ".$FileDetailsArray["FileSize"]);
ob_end_clean();
ob_start();
ob_start("ob_gzhandler");
$sql_GetFileDataBlobs = "SELECT FileData FROM file_data WHERE FileID = ".$_GET["id"]." ORDER BY FileDataID ASC;";
if (!$result_GetFileDataBlobs = mysql_query($sql_GetFileDataBlobs)) { die("Failure to retrieve file data blobs
Your Query: " . $sql_GetFileDataBlobs . "
Error: (" . mysql_errno() . ") " . mysql_error()); }
while ($row_GetFileDataBlobs = mysql_fetch_array($result_GetFileDataBlobs)) {
echo $row_GetFileDataBlobs["FileData"];
}
ob_end_flush();
header('Content-Length: '.ob_get_length());
ob_end_flush();
}
I have used Xdebug and output the results for peak memory usage, but nothing appears to go anywhere near the limit; in total, the peak memory usage was something like 900kb for the page.
So I am thinking it is aggregating the file chunks in memory and not letting them go, or something similar; the file chunks are the only thing that could reach that amount of memory and cause the script to fail.
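One way to narrow this down without Xdebug is to log PHP's own memory counters from inside the streaming loop; if the output buffer is accumulating the chunks, peak usage should climb with every row fetched. Something like this (the error_log() calls and the 100-row interval are just for illustration):
$i = 0;
while ($row_GetFileDataBlobs = mysql_fetch_array($result_GetFileDataBlobs)) {
    echo $row_GetFileDataBlobs["FileData"];
    if (++$i % 100 == 0) {
        // Log current and peak usage every 100 chunks (~6.4 MB of file data)
        error_log("chunk $i: " . memory_get_usage() . " / peak " . memory_get_peak_usage());
    }
}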
I can provide the script that uploads a file to the database so you can test my script if you like, just let me know.
Cheers for any help!
Mick
* ///////// SOLVED ///////// *
I just want to say thanks to hafichuk; a great response that solved my entire issue.
The problem was twofold.
1 - I wasn't using ob_flush() inside the while loop. I added that in and it appeared to free up a lot of memory, enabling larger downloads, but not unlimited.
For example, with memory_limit = 128M I could now download more than 40mb; in fact, I could now get up to around 200mb, but this is where it failed again. First memory problem solved, though.
LESSON 1: Flush your objects!
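For anyone hitting the same thing, the change was roughly this (a sketch of the relevant loop only; flush() is added alongside ob_flush() to push the data past PHP's own buffer as well):
while ($row_GetFileDataBlobs = mysql_fetch_array($result_GetFileDataBlobs)) {
    echo $row_GetFileDataBlobs["FileData"];
    ob_flush();   // push the current chunk out of the output buffer...
    flush();      // ...and out to the client, instead of accumulating it in memory
}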
2 - I was using mysql_query to retrieve the results of my SQL query. The problem is that it buffers the entire result set in PHP, and this was adding to my memory limit issue.
I ended up using mysql_unbuffered_query instead and this now works flawlessly.
This does, however, come with some limitations, such as locking the table while the results are being read.
LESSON 2: Don't buffer MySQL results if not required! (within programmatic limitations)
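The change itself is a one-liner; mysql_unbuffered_query streams rows from the server as you fetch them instead of pulling the whole result set into PHP memory first. A sketch of the final loop (note that mysql_num_rows() doesn't work on an unbuffered result, and all rows must be read before the same connection can run another query):
// Unbuffered version of the blob query: rows are fetched from MySQL one at a
// time, so PHP never holds the whole file in memory.
$result_GetFileDataBlobs = mysql_unbuffered_query($sql_GetFileDataBlobs);
while ($row_GetFileDataBlobs = mysql_fetch_array($result_GetFileDataBlobs)) {
    echo $row_GetFileDataBlobs["FileData"];
    ob_flush();
    flush();
}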
FINAL LESSON:
All of these fixes work; however, it requires some more testing to ensure that there are no problems with the combination of them.
Also, I have learned a lot more about objects and PHP memory allocation; I just wish there was a way to visually debug the process a little better than what Xdebug offers. If anyone has any ideas on how Xdebug could have shed some light on this process, please let me know in the comments.
Hope this helps someone else out in the future.
Cheers
Mick