I'm trying to loop through a 3 MB Excel file to collect all the data I then have to insert into the database. The worksheet I'm reading currently has 6,500 rows, though that may vary in the future. I've noticed that even though I'm using the recommended memory-saving techniques (read-data-only mode and a chunked read filter), the script still trips over the memory limit.
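For reference, PHPExcel also documents switching the cell cache to php://temp before the reader is created; a minimal sketch of that documented option (the 8MB threshold is just an example value, not what I necessarily use):

// Spill cell data to php://temp once it exceeds the in-memory threshold;
// this must be configured before the workbook is loaded.
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '8MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);

The loading loop itself: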
$reader = PHPExcel_IOFactory::createReaderForFile($file_path);
$reader->setReadDataOnly(true);
//$sheets = $this->getWorksheetNames($file['tmp_name'], 0);
$reader->setLoadSheetsOnly('spreadsheetname');

// Read filter that restricts loading to one chunk of rows at a time.
$chunkFilter = new IPO_Reader();
$reader->setReadFilter($chunkFilter);

$highestRow = 10000; //$this->objWorksheet->getHighestRow();
$chunkSize = 1;
$highestColumn = "Y";

for ($startRow = 2; $startRow <= $highestRow; $startRow += $chunkSize)
{
    // Tell the filter which rows to keep, then reload the file for this chunk.
    $chunkFilter->setRows($startRow, $chunkSize);
    $objPHPExcel = $reader->load($file_path);

    // Only rows inside the current chunk were loaded by the filter.
    for ($row = $startRow; $row < $startRow + $chunkSize; $row++)
    {
        $this->read_row = $objPHPExcel->getActiveSheet()
            ->rangeToArray('A' . $row . ':' . $highestColumn . $row, null, true, true, true);
        $this->read_row = end($this->read_row);

        // Map spreadsheet columns onto database column names.
        foreach ($this->read_row as $column => $value)
        {
            $db_column_name = $this->_getDbColumnMap($column);
            if (!empty($db_column_name))
            {
                $this->new_data_row[$db_column_name] = $this->_getRowData($value, $column);
            }
        }
        $this->read_row = null;

        $this->new_data_row['date_uploaded'] = date("Y-m-d H:i:s");
        $this->new_data_row['source_file_name'] = $file_name;

        $ipo_row = new Model_UploadData_IPO();
        $ipo_row->create($this->new_data_row);

        $this->new_data_row = null;
        unset($ipo_row);
        gc_collect_cycles();
    }

    // Break the workbook/worksheet circular references, then drop the object.
    $objPHPExcel->disconnectWorksheets();
    unset($objPHPExcel);
    gc_collect_cycles();
}
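For context, IPO_Reader implements PHPExcel_Reader_IReadFilter along the lines of the chunked-reading example in the PHPExcel documentation; a minimal sketch of that pattern:

class IPO_Reader implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    // Select which chunk of rows the next load() call should pull in.
    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    // PHPExcel calls this per cell; return true only for rows in the chunk.
    public function readCell($column, $row, $worksheetName = '')
    {
        return ($row >= $this->startRow && $row < $this->endRow);
    }
}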
When I measure the memory usage immediately before unsetting $objPHPExcel and immediately after, there is no gain at all. I'm really confused by this: splitting the read into chunks doesn't seem to let me reclaim memory after each chunk, usage rises steadily, and with the limit set to 250MB the script only manages to insert ~500 records before it dies.
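The check is essentially this (a sketch; memory_get_usage(true) reports the real allocation grabbed from the system, not just what PHP's allocator is currently using):

// Before/after check around the workbook teardown at the end of a chunk.
$before = memory_get_usage(true);

$objPHPExcel->disconnectWorksheets();
unset($objPHPExcel);
gc_collect_cycles();

$after = memory_get_usage(true);
printf("chunk at row %d: %.2f MB -> %.2f MB (peak %.2f MB)\n",
    $startRow, $before / 1048576, $after / 1048576,
    memory_get_peak_usage(true) / 1048576);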