3

So, I have an Excel file with 28k rows.
I want to load it and insert the data into the database, but it just stops (blank page).
I've tried reducing it to 5k rows, and it worked, but it's too slow.
I also tried using chunk, with only 5k rows, but I got "Maximum execution time of 300 seconds exceeded".
Here's the code:

Excel::filter('chunk')->load(storage_path('excel/exports/') . $fileName)->chunk(1000, function ($results) {
    foreach ($results as $row) {
        // even with nothing done per row, this is slow
    }
});

Is 5k rows really that big to handle?
Or am I doing it wrong?
Thanks.

Irfandi D. Vendy
  • 894
  • 12
  • 20
  • add `set_time_limit(0);` to avoid the time limit (or modify it in php.ini) – maztch May 08 '15 at 10:03
  • That's my last option. So, is there no other way? I mean, am I doing it right? – Irfandi D. Vendy May 08 '15 at 10:09
  • Isn't there an import function in your database admin panel? – Cas Bloem May 08 '15 at 10:27
  • ^What do you mean? My application is very simple: the admin uploads an Excel file, then the controller processes the Excel data and inserts it into the database. Inserting 28k rows is easy, but processing 28k Excel rows is my current problem... – Irfandi D. Vendy May 09 '15 at 01:32
  • Here is a faster alternative to Laravel Excel: https://github.com/rap2hpoutre/fast-excel It also uses less memory – rap-2-h Apr 09 '18 at 10:15

2 Answers

1

You're doing it by the book (using chunk, for example).
But 28k rows is a lot of data to handle.

You can raise your maximum execution time.
See: http://php.net/manual/en/function.set-time-limit.php

bool set_time_limit ( int $seconds )
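As a minimal sketch of applying this, assuming the import runs inside a controller action (the 600-second value and the memory limit are illustrative choices, not requirements):

```php
<?php
// Raise the execution limit only for this long-running import request.
// 0 would mean "no limit"; a finite value such as 600 is usually safer.
set_time_limit(600);

// Large spreadsheets are memory-hungry when fully loaded, so it can
// help to raise the memory limit for this request as well.
ini_set('memory_limit', '512M');

// ... run the Excel import here ...
```

Changing `max_execution_time` in php.ini has the same effect but applies to every request, so a per-request `set_time_limit()` is usually the narrower fix.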

Hope this will help.

Cas Bloem
  • 4,846
  • 2
  • 24
  • 23
  • Yeah, just like I said in the comment, that will be my last option. So 28k is considered massive, eh? I've tried reducing it to 10k rows, and it took ~10 min to iterate over them all. – Irfandi D. Vendy May 09 '15 at 01:26
1

Using chunk is great for preventing memory exhaustion, but it slows down your execution time.

Increase the chunk size if you want it to run faster, but be careful with that.

Note: at the end of every chunk, your application reads the file again, and that takes time.
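A related way to claw back speed on the database side is to buffer rows and insert them in batches rather than one query per row. This plain-PHP sketch shows only the batching itself (the row data and column names are made up); with Laravel, each batch could go into a single `DB::table(...)->insert($batch)` call, which accepts an array of rows:

```php
<?php
// Build 28k example rows (stand-in for the parsed spreadsheet data).
$rows = [];
for ($i = 0; $i < 28000; $i++) {
    $rows[] = ['name' => "row$i", 'value' => $i];
}

// Group them into batches of 1000: 28 multi-row INSERTs
// instead of 28,000 single-row round-trips.
$batches = array_chunk($rows, 1000);

foreach ($batches as $batch) {
    // With Laravel's query builder this would be one query per batch:
    // DB::table('imports')->insert($batch);
}
```

The batch size is a trade-off: bigger batches mean fewer queries, but each query carries a larger payload and more bound parameters.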

thavorac
  • 71
  • 7