I run into a memory limit error when using fetchAll, so I'm trying to use fetch instead, but I can't find a way to do it. Any suggestion? Where/how should I use a while loop instead of the foreach?
Here is the original code:
// We get all the data from the table
$Qselect = $pdo->prepare('SELECT * FROM '.$table_to_export);
$Qselect->execute(); // No parameters needed, the query has no placeholders
$results = $Qselect->fetchAll(PDO::FETCH_ASSOC); // Here is the problem
$countRmain = count($results); // rowCount() is unreliable for SELECT, so count the fetched rows
// We get the column names
$Qdescribe = $pdo->prepare('DESCRIBE '.$table_to_export);
$Qdescribe->execute();
$table_fields = $Qdescribe->fetchAll(PDO::FETCH_COLUMN); // No problem here
$limit = count($table_fields) - 1; // Index of the last column in the table
foreach ($table_fields as $fields) {
    $outputCsv .= trim($fields).';';
}
// We remove the ; at the end
$outputCsv = rtrim($outputCsv, ';');
// Data
$outputCsv .= "\n";
if ($countRmain > 0) {
    foreach ($results as $row) {
        $column = 0;
        foreach ($row as $key => $val) {
            if (is_null($val)) {
                $outputCsv .= 'NULL;'; // If the field is NULL, we write NULL
            } else {
                $outputCsv .= $val.';'; // We add the value to the output
            }
            if ($column == $limit) {
                $outputCsv .= "\n";
            }
            $column++;
        }
    }
} else {
    exit('No data to export');
}
I tried wrapping the foreach loop in while ($results = $Qselect->fetch()) { ... } instead, but that takes a really long time (10 minutes for 50,000 rows).
PS: if I increase the PHP memory_limit it works with fetchAll, but I don't want that solution.
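For reference, here is a minimal sketch of the row-by-row version I'm attempting. It assumes the same $pdo connection and $table_to_export variable as above, and that $outputCsv already holds the header line; it fetches one row at a time instead of loading the whole result set into $results:

```php
<?php
// Sketch: fetch rows one at a time to keep memory usage flat.
// Assumes $pdo, $table_to_export, and $outputCsv exist as in the code above.
$Qselect = $pdo->prepare('SELECT * FROM '.$table_to_export);
$Qselect->execute();

// Each call to fetch() returns the next row, or false when done
while ($row = $Qselect->fetch(PDO::FETCH_ASSOC)) {
    $fields = array();
    foreach ($row as $val) {
        // NULL fields are written literally as NULL, as in the original
        $fields[] = is_null($val) ? 'NULL' : $val;
    }
    $outputCsv .= implode(';', $fields) . "\n";
}
```

The idea is the same output format as the original foreach over $results, just built incrementally per row.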