
I am creating CSV files, sometimes quite large (100k or more rows), from database information for users to then download. It appears I am running into a memory issue during the CSV creation on some of the larger files. Here is an example of how I am currently handling creation of the CSV.

Is there any way around this? The memory limit was originally 32 MB; I changed it to 64 MB and am still having the issue.

//columns array
$log_columns = array(
    '1',
    '2',
    '3',
    '4',
    '5',
    '6',
    '7',
    '8',
    '9'
    );

//results from the db
$results = $log_stmt->fetchAll(PDO::FETCH_ASSOC);

$log_file = 'test.csv';

$log_path = $_SERVER['DOCUMENT_ROOT'].'/../user-data/'.$_SESSION['user']['account_id'].'/downloads/';

// if location does not exist create it
if(!file_exists($log_path)) 
{
    mkdir($log_path, 0755, true);
}

// open file handler
$fp = fopen($log_path.$log_file, 'wb');

// write the csv column titles / labels
fputcsv($fp, $log_columns);

//are there any logs?
if($results)
{

    //write the rows
    foreach($results as $row)
    {

        //rows array
        $log_rows = array(
            $row['1'],
            $row['2'],
            $row['3'],
            $row['4'],
            $row['5'],
            $row['6'],
            $row['7'],
            $row['8'],
            $row['9']
        );
        //write the rows
        $newcsv = fputcsv($fp, $log_rows);

    }//end foreach
}
// there were no results so just return an empty log
else
{
    $newcsv = fputcsv($fp, array('No results found.') );
}

//close handler 
fclose($fp);

// if csv was created return true
if($newcsv)
{       
    return true;        
}

UPDATE :

Using a while loop and fetch instead of foreach and fetchAll still produces a memory error.

while($result = $log_stmt->fetch(PDO::FETCH_ASSOC))

How is that possible if I am only loading one row at a time?
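
For reference, this is roughly what that version looks like (a sketch only; same $fp, $log_columns, and column keys as above, with the rest of the file handling unchanged):

// write the csv column titles / labels as before
fputcsv($fp, $log_columns);

// fetch and write one row at a time instead of using fetchAll()
while($row = $log_stmt->fetch(PDO::FETCH_ASSOC))
{
    fputcsv($fp, array(
        $row['1'], $row['2'], $row['3'],
        $row['4'], $row['5'], $row['6'],
        $row['7'], $row['8'], $row['9']
    ));
}

fclose($fp);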

UPDATE 2 :

I have tracked this down further to the while loop, using memory_get_usage():

echo (floor( memory_get_usage() / 1024) ).' kb<br />';

Before the while loop starts the result is 4658 kb, and then as the while loop runs it increases by 1 kb every 2-3 iterations until it reaches the 32748 kb memory limit.
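
The echo sits directly inside the loop, roughly like this (a sketch; the loop itself is unchanged):

echo 'before loop: '.floor( memory_get_usage() / 1024 ).' kb<br />';

while($results = $log_stmt->fetch(PDO::FETCH_ASSOC))
{
    //memory climbs by about 1 kb every 2-3 iterations
    echo floor( memory_get_usage() / 1024 ).' kb<br />';
}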

What can I do to solve this issue?

UPDATE 3 :

I played around with this more today... the way this works just does not make much sense to me; I can only assume it is strange behaviour of PHP's garbage collection.

Scenario 1: My query gets all 80k rows and a while loop outputs them. Memory usage is around 4500 kb after the query is executed, then increments by 1 kb every two to three rows output in the loop. Memory is not released whatsoever, and at some point it crashes from lack of memory.

while($results = $log_stmt->fetch(PDO::FETCH_ASSOC))
{
    echo $results['timestamp'].'<br/>';
}

Scenario 2: The query is now run in a loop, fetching 1000 rows at a time, with an inner loop outputting each row. Memory maxes out at around 400 kb as it loops, and the entire output completes with no memory issues.

For this example I just ran the counter 80 times, since I know there are more than 80k rows to retrieve. In reality I would obviously have to do this differently (a sketch of that follows the code below).

$t_counter = 0;

while($t_counter < 80)
{
    //set bindings
    $binding = array(
        'cw_start' => $t_counter * 1000,
        //some other bindings...
    );
    $log_stmt->execute($binding);

    echo $t_counter.' after statement '.floor( memory_get_usage() / 1024 ).' kb<br />';

    while($results = $log_stmt->fetch(PDO::FETCH_ASSOC))
    {
        echo $results['capture_timestamp'].'<br/>';
    }

    echo $t_counter.' after while '.floor( memory_get_usage() / 1024 ).' kb<br />';

    $t_counter++;
}
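
In reality, instead of the hard-coded counter I would probably loop until a chunk comes back short. A rough sketch of what I mean (assuming the query caps each chunk at 1000 rows and cw_start is the row offset, as above):

$offset = 0;

do
{
    //set bindings for the next chunk of up to 1000 rows
    $binding = array(
        'cw_start' => $offset,
        //some other bindings...
    );
    $log_stmt->execute($binding);

    $chunk_rows = 0;

    while($results = $log_stmt->fetch(PDO::FETCH_ASSOC))
    {
        echo $results['capture_timestamp'].'<br/>';
        $chunk_rows++;
    }

    $offset += 1000;

} while($chunk_rows == 1000); //a short (or empty) chunk means we are done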

So I guess my question is: why does the first scenario keep incrementing memory usage without ever releasing anything? There are no new variables in that while loop and everything is 'reused'. The exact same situation occurs in the second scenario, just inside another loop.

user756659
  • Have you tried working with a pagination concept? Get the count and then iterate over it to generate the CSV. – Rajesh Ujade Oct 06 '14 at 06:31
  • I think you can try adding `ob_end_clean();` before `$fp = fopen($log_path.$log_file, 'wb');` – Alex Szabo Oct 06 '14 at 06:32
  • @rajeshujade - not following... what do you mean? I have an array of row data in $results then I use a foreach loop to add each row to the csv... – user756659 Oct 06 '14 at 06:33
  • @Alex - there wouldn't be a buffer 'open' in the first place though right? That would erase the buffer and turn it off, but there shouldn't be one. – user756659 Oct 06 '14 at 06:36

3 Answers


fetchAll fetches all of the records at once. Why not just run the query and loop over the results with fetch in a while loop? Then it does not need to load the whole result set into memory.

http://php.net/manual/en/pdostatement.fetch.php

Chris

Then I think you should try reading the data in chunks: read a batch at a time and append it to one CSV file, so that you free memory during the process. You could do a COUNT(*), but try to find the total count before starting the multiple fetches.
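
Something along these lines (just a sketch; the $pdo connection, the logs table name, and the 1000-row chunk size are assumptions, and $fp is an open CSV file handle as in the question):

// total row count first, so we know how many chunks to read
$total = (int) $pdo->query('SELECT COUNT(*) FROM logs')->fetchColumn();

$chunk = 1000;
$stmt  = $pdo->prepare('SELECT * FROM logs LIMIT :limit OFFSET :offset');

for ($offset = 0; $offset < $total; $offset += $chunk)
{
    $stmt->bindValue(':limit',  $chunk,  PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
    $stmt->execute();

    // append this chunk to the csv before moving on to the next one
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC))
    {
        fputcsv($fp, $row);
    }
}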

eddymens
  • So count(*) on the query to get the total then fetch it multiple times using limits? Or is there a more preferred method for this? – user756659 Oct 07 '14 at 01:19

I have been using PHP's CSV functions myself; I even use CSV as a database system (NoSQL). Try:

csv code for reading

<?php
$CSVfp = fopen("filename.csv", "r");
if($CSVfp !== FALSE)
{
    while(! feof($CSVfp))
    {
        $line = fgetcsv($CSVfp);
        // do something with $line
    }
    fclose($CSVfp);
}
?>

csv code for writing

<?php
$list = array(
    "edmond,dog,cat,redonton",
    "Glenn,Quagmire,Oslo,Norway",
);

$file = fopen("filename.csv", "w");

foreach ($list as $line)
{
    fputcsv($file, explode(',', $line));
}

fclose($file);
?>
eddymens
  • I think you missed the point of the post - this code essentially does the same thing mine does. Now try yours with 80k rows in your $list array. – user756659 Oct 06 '14 at 19:51