So, I have an Excel file with 28k rows.
I want to load it and then insert the rows into the database, but the script just stops and returns a blank page.
I've tried cutting the file down to 5k rows, and that worked, but it is far too slow.
I also tried chunking, still with only 5k rows, but then I got "Maximum execution time of 300 seconds exceeded".
Here's the code:
Excel::filter('chunk')->load(storage_path('excel/exports/') . $fileName)->chunk(1000, function ($results) {
    foreach ($results as $row) {
        // even with nothing done here, it still times out
    }
});
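
For reference, what I ultimately want to do inside the chunk callback is a plain batch insert, roughly like the sketch below (the `items` table and the `name`/`email` columns are just placeholders, not my real schema):

use Illuminate\Support\Facades\DB;
use Maatwebsite\Excel\Facades\Excel;

Excel::filter('chunk')->load(storage_path('excel/exports/') . $fileName)->chunk(1000, function ($results) {
    $batch = [];

    foreach ($results as $row) {
        // Map spreadsheet headings to table columns (placeholder names, not my real schema).
        $batch[] = [
            'name'  => $row->name,
            'email' => $row->email,
        ];
    }

    // One multi-row insert per chunk instead of one query per row.
    DB::table('items')->insert($batch);
});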
Are 5k rows really that many to handle?
Or am I doing it wrong?
Thanks.