I am using the Laravel Excel package in my project. The problem is that while importing rows, the script stops mid-process with a message about exceeding the memory limit. That is really strange, considering that I am processing the data in chunks precisely to avoid memory problems. Here is the code:

public function process($file, Import $import)
{
    $newArr = $this->getColumnNamesReformatted($file);

    Excel::filter('chunk')->load($file)->chunk(1000, $this->setData($newArr, $import));
}

/**
 * @param $newArr
 * @param $import
 * @return \Closure
 */
public function setData($newArr, $import)
{
    return function ($results) use ($newArr, $import){

        foreach ($results as $row){
            $values = array_values($row->all());
            $data = [
                'import_id' => $import->id
            ];
            foreach ($values as $key => $value){
                if(in_array($newArr[$key], config('billing.columns'))){
                    $data[$newArr[$key]] = $value;
                }
            }

            $data['usage_start_at'] = Carbon::parse($data['lineItem_UsageStartDate']);
            $data['usage_end_at']   = Carbon::parse($data['lineItem_UsageEndDate']);
            Record::create($data);

            echo "|";
        }
    };
}

I just can't seem to understand where the memory leak is. By the way, the problem only occurs when I run the code in a Vagrant VM with 1 GB of RAM. When I run the same code locally, it completes without any trouble.
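To narrow the leak down, this is the kind of per-chunk memory logging I can drop into the end of the chunk closure (the helper name and the MB formatting are my own, not part of the package):

```php
<?php
// Hypothetical debugging helper: report current and peak memory
// after each chunk so growth between chunks becomes visible.
function reportMemory(int $chunkNo): void
{
    printf(
        "chunk %d: current %.1f MB, peak %.1f MB\n",
        $chunkNo,
        memory_get_usage(true) / 1048576,      // bytes -> MB
        memory_get_peak_usage(true) / 1048576
    );
}

// Standalone demonstration; in the import it would be called
// once per chunk inside the closure returned by setData().
reportMemory(1);
```

If the "current" number keeps climbing chunk after chunk, something is being retained across chunks rather than freed.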

naneri
1 Answer

Have you tried increasing "memory_limit" in php.ini? If not, give it a try.

php -i | grep "php.ini"

Running the command above shows which configuration file is loaded.

Increase the memory_limit and restart the web server.
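For example, a setting like the following in the loaded php.ini (512M is only an illustration; tune it to the VM's available RAM):

```ini
; php.ini - per-script memory ceiling
memory_limit = 512M
```

For a one-off CLI import you can also raise it for a single run without editing the file, e.g. `php -d memory_limit=512M artisan <your-import-command>`.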