
I have a CSV file with 30k rows. I have used maatwebsite/excel to import the CSV into my PostgreSQL database.

The problem is that every time, after uploading 10k-12k rows into the database, the page gives HTTP ERROR 500.

Error:

This page isn’t working
localhost is currently unable to handle this request.
HTTP ERROR 500

I have changed the variables below in php.ini:

max_execution_time=0
max_input_time=3000
post_max_size=128M

I have tried the below code in ReportsImport:

use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithCustomCsvSettings;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class UserReport implements ToModel, WithCustomCsvSettings, WithChunkReading, WithHeadingRow
{
    public function model(array $row)
    {
          // dd($row);
          return new UserReport([
              'user'     => $row['username'],
              'amount'   => $row['amount']
          ]);
    }
    
    public function getCsvSettings(): array
    {
         return [
              'input_encoding' => 'UTF-8'
         ];
    }
    
    public function chunkSize(): int
    {
        return 1000;
    }
}

How can I resolve this HTTP 500 error?

Error log: local.ERROR: Allowed memory size of 536870912 bytes exhausted

Version: Laravel Framework 7.26.1


1 Answer


The problem is that the application is trying to hold too much data in memory. I see you're already using chunk reading, but it appears not to be enough.

To decrease the memory consumption when reading the table data, try decreasing your chunk size.

To decrease the memory consumption of the models, try adding batch inserts (the WithBatchInserts concern) in addition to chunk reading; a sketch combining both is shown below.
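A minimal sketch of how the two suggestions could be combined, assuming the import class (here given the hypothetical name UserReportImport) is separate from an App\UserReport Eloquent model, and carrying over the CSV settings from the question:

<?php

namespace App\Imports;

use App\UserReport; // assumed Eloquent model (Laravel 7 default namespace)
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithCustomCsvSettings;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class UserReportImport implements ToModel, WithCustomCsvSettings, WithChunkReading, WithBatchInserts, WithHeadingRow
{
    public function model(array $row)
    {
        // Each row becomes an unsaved model; the package collects them
        // and persists a whole batch with a single insert query.
        return new UserReport([
            'user'   => $row['username'],
            'amount' => $row['amount'],
        ]);
    }

    public function getCsvSettings(): array
    {
        return ['input_encoding' => 'UTF-8'];
    }

    // Read the file in smaller slices so fewer rows are held in memory at once.
    public function chunkSize(): int
    {
        return 500;
    }

    // Insert 500 rows per query instead of one query per row.
    public function batchSize(): int
    {
        return 500;
    }
}

The values of 500 are only starting points; lower them further if the memory limit is still hit. The import itself is triggered as usual, e.g. Excel::import(new UserReportImport, $path);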
