I am creating sometimes quite large CSV files from database data for users to then download: 100k or more rows. It appears I am running into a memory issue during the CSV creation on some of the larger files.
Is there any way around this? I originally had the limit at 32MB and changed it to 64MB, but I am still having the issue.
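For reference, this is roughly how I raised the limit (a sketch; I am assuming the 32MB/64MB figure refers to PHP's memory_limit, and it could just as well be set in php.ini instead):

// raise the per-request memory limit at runtime (assumed approach; php.ini works too)
ini_set('memory_limit', '64M');

Here is an example of how I am currently handling creation of the CSV: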
//columns array
$log_columns = array(
    '1',
    '2',
    '3',
    '4',
    '5',
    '6',
    '7',
    '8',
    '9'
);

//results from the db
$results = $log_stmt->fetchAll(PDO::FETCH_ASSOC);

$log_file = 'test.csv';
$log_path = $_SERVER['DOCUMENT_ROOT'].'/../user-data/'.$_SESSION['user']['account_id'].'/downloads/';

// if location does not exist create it
if(!file_exists($log_path))
{
    mkdir($log_path, 0755, true);
}

// open file handler
$fp = fopen($log_path.$log_file, 'wb');

// write the csv column titles / labels
fputcsv($fp, $log_columns);

//are there any logs?
if($results)
{
    //write the rows
    foreach($results as $row)
    {
        //rows array
        $log_rows = array(
            $row['1'],
            $row['2'],
            $row['3'],
            $row['4'],
            $row['5'],
            $row['6'],
            $row['7'],
            $row['8'],
            $row['9']
        );

        //write the row
        $newcsv = fputcsv($fp, $log_rows);
    }//end foreach
}
// there were no results so just return an empty log
else
{
    $newcsv = fputcsv($fp, array('No results found.') );
}

//close handler
fclose($fp);

// if csv was created return true
if($newcsv)
{
    return true;
}
UPDATE:
Using a while loop with fetch() instead of foreach with fetchAll() still produces a memory error.
while($result = $log_stmt->fetch(PDO::FETCH_ASSOC))
How is that possible if I am only loading one row at a time?
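For clarity, the whole rewritten write loop looks roughly like this (a sketch; the numbered column names are placeholders, same as above):

// stream rows one at a time instead of loading the whole result set with fetchAll()
while($row = $log_stmt->fetch(PDO::FETCH_ASSOC))
{
    // build the row in the same column order as the header
    $log_rows = array(
        $row['1'],
        $row['2'],
        $row['3']
        //...remaining placeholder columns
    );
    fputcsv($fp, $log_rows);
}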
UPDATE 2:
I have tracked this down further to the while loop, using memory_get_usage():
echo (floor( memory_get_usage() / 1024) ).' kb<br />';
Before the while loop starts the reported usage is 4658 kb, and then it increases by 1 kb every 2-3 iterations of the while loop until it reaches the 32748 kb maximum allowed memory.
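Roughly where those measurements sit (a sketch):

echo 'before loop '.floor( memory_get_usage() / 1024 ).' kb<br />';

while($result = $log_stmt->fetch(PDO::FETCH_ASSOC))
{
    // print the usage on every iteration to watch it climb
    echo floor( memory_get_usage() / 1024 ).' kb<br />';
}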
What can I do to solve this issue?
UPDATE 3:
I played around with this more today... the way this works just does not make much sense to me; I can only assume it is strange behaviour of PHP's garbage collection.
Scenario 1: My query gets all 80k rows and a while loop outputs them. Memory usage is around 4500 kb after the statement is executed, then it increments by 1 kb for every two to three rows output in the loop. Memory is not released whatsoever, and at some point it crashes from running out of memory.
while($results = $log_stmt->fetch(PDO::FETCH_ASSOC))
{
    echo $results['timestamp'].'<br/>';
}
Scenario 2: The query is now run in a loop, fetching 1000 rows at a time, with an inner loop outputting each row. Memory maxes out around 400k as it loops, and the entire output completes with no memory issues.
For this example I just used a counter that runs 80 times, since I know there are more than 80k rows to retrieve. In reality I would obviously have to do this differently (see the sketch after the loop below).
$t_counter = 0;

while($t_counter < 80)
{
    //set bindings
    $binding = array(
        'cw_start' => $t_counter * 1000,
        //some other bindings...
    );

    $log_stmt->execute($binding);

    echo $t_counter.' after statement '.floor( memory_get_usage() / 1024 ).' kb<br />';

    while($results = $log_stmt->fetch(PDO::FETCH_ASSOC))
    {
        echo $results['capture_timestamp'].'<br/>';
    }

    echo $t_counter.' after while'.floor( memory_get_usage() / 1024 ).' kb<br />';

    $t_counter++;
}
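In practice the hard-coded counter would be replaced by something like this (a sketch; it assumes the query pages through the rows via the cw_start binding with a fixed page size, here a hypothetical $chunk_size, and stops once a chunk comes back short):

$chunk_size = 1000; // hypothetical page size
$offset = 0;

do
{
    //set bindings for this chunk
    $binding = array(
        'cw_start' => $offset,
        //some other bindings...
    );
    $log_stmt->execute($binding);

    // count the rows in this chunk so we know when we have run out
    $row_count = 0;
    while($results = $log_stmt->fetch(PDO::FETCH_ASSOC))
    {
        $row_count++;
        echo $results['capture_timestamp'].'<br/>';
    }

    $offset += $chunk_size;
}
while($row_count === $chunk_size); // a short (or empty) chunk means there is nothing left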
So I guess my question is: why does the first scenario keep using more and more memory with nothing being released? There are no new variables in that while loop and everything is 'reused'. Exactly the same situation happens in the second scenario, just wrapped inside another loop.