I have a CSV file with 30k rows. I am using Maatwebsite Excel to import the CSV into my PostgreSQL database.
The problem is that every time, after inserting about 10k-12k rows, the page fails with HTTP ERROR 500.
Error:
This page isn’t working
localhost is currently unable to handle this request.
HTTP ERROR 500
I have changed the following variables in php.ini:
max_execution_time=0
max_input_time=3000
post_max_size=128M
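I have not changed memory_limit yet. Given the "Allowed memory size ... exhausted" entry in the error log at the bottom, I assume the import is hitting PHP's 512M memory limit, so raising it may be the next step (1024M is just an example value):

; php.ini -- assumption: the 536870912 bytes in the log is the current 512M limit
memory_limit=1024M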
I have tried the code below in ReportsImport:
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithCustomCsvSettings;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class UserReport implements ToModel, WithCustomCsvSettings, WithChunkReading, WithHeadingRow
{
    // Map one CSV row (keyed by the heading row) to a new model instance
    public function model(array $row)
    {
        return new UserReport([
            'user'   => $row['username'],
            'amount' => $row['amount'],
        ]);
    }

    // Force the CSV to be read as UTF-8
    public function getCsvSettings(): array
    {
        return [
            'input_encoding' => 'UTF-8',
        ];
    }

    // Read the file in chunks of 1000 rows to limit memory use
    public function chunkSize(): int
    {
        return 1000;
    }
}
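From the package docs I understand that chunk reading only limits how much of the spreadsheet is held in memory at once; with ToModel, each row still becomes its own model and its own INSERT query. Adding the WithBatchInserts concern (available in maatwebsite/excel 3.x) is supposed to group those inserts. A minimal sketch of how I think the class would look with it (untested; the batch size of 1000 is arbitrary):

use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithCustomCsvSettings;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class UserReport implements ToModel, WithCustomCsvSettings, WithChunkReading, WithHeadingRow, WithBatchInserts
{
    public function model(array $row)
    {
        return new UserReport([
            'user'   => $row['username'],
            'amount' => $row['amount'],
        ]);
    }

    public function getCsvSettings(): array
    {
        return ['input_encoding' => 'UTF-8'];
    }

    // Read 1000 rows of the file at a time
    public function chunkSize(): int
    {
        return 1000;
    }

    // New: persist 1000 models per INSERT statement instead of one query per row
    public function batchSize(): int
    {
        return 1000;
    }
}

The docs also say the import class can implement Illuminate\Contracts\Queue\ShouldQueue so that each chunk runs as a queued job outside the HTTP request, which would avoid the 500 page entirely, but I would prefer to fix the synchronous import first if possible.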
How can I resolve this HTTP 500 error?
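For reference, this is roughly how I trigger the import from the controller (the field name 'report' is a placeholder, not my exact code):

use App\Imports\UserReport; // assumption: the import class lives in App\Imports
use Illuminate\Http\Request;
use Maatwebsite\Excel\Facades\Excel;

public function import(Request $request)
{
    // 'report' is a hypothetical name for the uploaded-file input
    Excel::import(new UserReport, $request->file('report'));

    return back()->with('status', 'Import finished');
}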
Error log: local.ERROR: Allowed memory size of 536870912 bytes exhausted
Version: Laravel Framework 7.26.1