
I run into a memory limit error using fetchAll, so I'm trying to use fetch instead, but I can't find a way to do it. Any suggestions? Where/how should I use a while loop instead of the foreach?

Here is the original code:

// We get all the data from the table
$Qselect = $pdo->prepare('SELECT * FROM '.$table_to_export);
$Qselect->execute(); // No placeholders in the query, so no parameters to bind
$results = $Qselect->fetchAll(PDO::FETCH_ASSOC); // Here is the problem
$countRmain = $Qselect->rowCount();

// We get the column names
$Qdescribe = $pdo->prepare('DESCRIBE '.$table_to_export);
$Qdescribe->execute();
$limit = $Qdescribe->rowCount()-1;      // Index of the last column of the table
$table_fields = $Qdescribe->fetchAll(PDO::FETCH_COLUMN); // No problem here

foreach($table_fields as $key => $fields){
    $outputCsv .= trim($fields).';';
}

// We remove the ; at the end
$outputCsv = rtrim($outputCsv, ';');

// Data
$outputCsv .= "\n";
if($countRmain > 0){
    foreach($results as $row){
        $column = 0;
        foreach ($row as $key => $val){
            if (is_null($val)){
                $outputCsv .= 'NULL;';  // If the field is empty, we add NULL
            }
            else {
                $outputCsv .= $val.';'; // We add the value to the file
            }
            if ($column == $limit)
                $outputCsv .= "\n";
            $column++;
        }
    }
}
else {
    exit('No data to export');
}

I tried putting the foreach loop inside while($results = $Qselect->fetch()){, but that takes a really long time (10 minutes for 50,000 rows).

PS: If I increase the PHP memory_limit, it works with fetchAll, but I don't want that solution.

remyremy

2 Answers


Try this one:

1. Comment out line 4 (the fetchAll call).
2. Replace line 23 (the foreach($results as $row){ line) with:
    while($row = $Qselect->fetch(PDO::FETCH_ASSOC)){
3. Skip (or adjust) the $countRmain check on line 22.
The idea is simple: you still run the same query, but instead of loading the whole result array at once you load the records one by one. That shouldn't be noticeably slower even for 1M rows, because you are replacing one loop over the records with another loop over the same records.
If you still have performance problems, try reorganizing/optimizing your code.
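
Put together, a minimal sketch of the reworked loop (assuming the same $pdo, $table_to_export, $limit, and $outputCsv variables as in the question):

// Stream the rows one by one instead of loading them all with fetchAll
$Qselect = $pdo->prepare('SELECT * FROM '.$table_to_export);
$Qselect->execute();

$countRmain = 0;
while($row = $Qselect->fetch(PDO::FETCH_ASSOC)){
    $countRmain++;
    $column = 0;
    foreach ($row as $val){
        $outputCsv .= is_null($val) ? 'NULL;' : $val.';';
        if ($column == $limit)
            $outputCsv .= "\n";
        $column++;
    }
}
if ($countRmain == 0)
    exit('No data to export');

Note that the MySQL driver buffers the whole result set on the client by default, so fetch() alone may not reduce memory much; calling $pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false) before running the query makes it actually stream rows from the server (with the usual unbuffered-query restrictions).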

StasGrin

I have also encountered this problem.

Increase the following variables so that your page execution will not stop (illustrative values in the sketch below):

max_input_time
memory_limit
max_execution_time
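
For example, in php.ini (a sketch; the values here are arbitrary, tune them to your data):

max_input_time = 300
memory_limit = 512M
max_execution_time = 300

memory_limit and max_execution_time can also be raised at runtime with ini_set(); max_input_time cannot, since it is only settable in php.ini or per directory.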

or you can use

while($row = $Qselect->fetch(PDO::FETCH_ASSOC)){

in place of fetchAll.
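
And if you write the export straight to a file as rows are fetched, the script never has to hold the whole CSV in memory. A rough sketch, reusing $Qselect and $table_fields from the question (the export.csv name is just an example):

$fp = fopen('export.csv', 'w');

// Header line with the column names
fwrite($fp, implode(';', array_map('trim', $table_fields))."\n");

// One row per fetch, written out immediately
while($row = $Qselect->fetch(PDO::FETCH_ASSOC)){
    foreach ($row as $key => $val){
        if (is_null($val)) $row[$key] = 'NULL';
    }
    fwrite($fp, implode(';', $row)."\n");
}
fclose($fp);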

Shail
  • @remyremy It's your choice; either increase the variables or use fetch as described above. But you also need to optimize your code when using fetch. – Shail Sep 10 '12 at 04:57
  • Several people use this from their own WAMP servers, so I don't want to rely on that solution in the long term. I didn't vote down, but I guess someone did because in my original message I mentioned I don't want to change the memory_limit. Thanks anyway. I will try StasGrin's answer soon and optimize the code. – remyremy Sep 10 '12 at 16:27