
I'm writing a web app to parse Excel files, each containing a lot of data (~47 columns and thousands of rows). The framework is Laravel 4.2 and the package used is laravel-excel (maatwebsite/excel).

When I wrote all the code, I had a sample file from the customer; it contained 620 rows and everything worked fine. Now some files work, but most of them do not, and the errors are strange. Ok, step by step:

Logic

A user selects a file via <input type="file">, then this file is sent to the server via $.ajax. The server creates a Job instance with the properties of this file and returns this instance to the client. The client receives this Job instance and checks whether the progress of this Job (i.e. the number of rows parsed) is lower than its total (i.e. the total number of rows). If it is, the client sends a request to the server to execute this Job, i.e. to parse a certain number of additional rows, e.g. 200. So there is always a dialogue between the client and the server, something like this (the jobs table behind the Job model is sketched right after this exchange):

  • Client: Dear Server, hereby I upload this file orders_123.xlsx.
  • Server: Dear Client, thank you, I stored your file and created a Job with id = 27. Your file has total = 623 rows, and the current progress = 0.
  • Client: Dear Server, thank you, please, execute this Job with id = 27, taking 200 rows. Reply ASAP.
  • Server: Dear Client, I did what you asked, your Job with id = 27 now has progress = 200.
  • Client: Ok, Server, please, execute this Job again and again take 200 rows.
  • And so it continues, until the Job is finished.
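
The Job model itself is nothing special; the jobs table behind it is roughly this (a simplified sketch, column types approximate):

Schema::create('jobs', function($table) {
    // Simplified: only the columns referenced elsewhere in this question.
    $table->increments('id');
    $table->string('type');                    // e.g. 'orders'
    $table->string('link');                    // path to the stored upload
    $table->string('original_name');
    $table->integer('total');                  // total number of rows in the sheet
    $table->integer('progress')->default(0);   // rows parsed so far
    $table->integer('user_id');
    $table->timestamps();
});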

You may ask why I did it so strangely instead of just asking the server to import ALL of the rows at once, but here, again, some dark magic is involved: this chunked approach is the only way it works most of the time (the server simply fails otherwise).

JavaScript

function uploadFile(file) {
    var data = new FormData();
    data.append("file", file);
    showProgressBar(file.name);
    $.ajax({
        type: "POST",
        url: "/import/orders",
        data: data,
        cache: false,
        processData: false,
        contentType: false,
        success: function(response) {
            if (response.status == "error") {
                hideProgressBar(file.name + ": Error! " + response.data, response.status);  
            } else if (response.status == "success") {
                executeJob(response.job, 100);
            }
        },
        xhr: function() {
            var xhr = $.ajaxSettings.xhr();
            if (typeof xhr.upload === "object") {
                xhr.upload.addEventListener("progress", function(e) {
                    if (e.lengthComputable) {
                        var val = Math.floor(100 * e.loaded / e.total); // percent uploaded so far
                        updateProgressBar(val);
                    }
                }, false);
            }
            return xhr;
        }
    });
}

function executeJob(job, take) {
    $.ajax({
        type: "POST",
        url: "/jobs/execute",
        data: {
            job: job,
            take: take
        },
        dataType: "json",
        success: function(response) {
            if (response.status == "error") {
                hideProgressBar(job.original_name + ": Error! " + response.data, response.status);
            } else if (response.status == "success") {
                var val = Math.floor(100 * response.job.progress / response.job.total);
                updateProgressBar(val);
                if (val >= 100) {
                    hideProgressBar(job.original_name + ": Success!", response.status);
                    deleteJob(job);
                } else {
                    executeJob(job, take);
                }
            }
        }
    }, "json");
}

Routes

Route::post('/import/orders', array('before' => 'csrf', 'uses' => 'OrdersFPController@handleOrdersImport'));
Route::post('/jobs/execute', array('before' => 'csrf', 'uses' => 'JobsController@handleExecute'));
Route::post('/jobs/delete', array('before' => 'csrf', 'uses' => 'JobsController@handleDelete'));

OrdersFPController@handleOrdersImport

class OrdersFPController extends BaseController {

    public function handleOrdersImport()
    {
        $file = Input::file('file');
        $fields = ['order', 'location', ...];

        if (!$file->isValid()) {
            return Response::json(array('status' => 'error', 'data' => 'File is invalid.'));
        }

        $filename = $file->getClientOriginalName();
        $extension = $file->getClientOriginalExtension();
        $extension_guessed = $file->guessExtension();

        if ($extension != $extension_guessed) {
            return Response::json(array('status' => 'error', 'data' => 'Wrong extension of the file: ".' . $extension . '", should be ".' . $extension_guessed . '".'));
        }

        $filename_new = str_random(20) . '.' . $extension;
        $path = public_path() . '/assets/import/orders';
        $file->move($path, $filename_new);

        $sheet = Excel::load($path . '/' . $filename_new, function($reader) {})->get();

        if (is_null($sheet)) {
            File::delete($path . '/' . $filename_new);
            return Response::json(array('status' => 'error', 'data' => 'Could not load any sheets in the file.'));
        }

        $job_total = $sheet->count();
        if ($job_total < 1) {
            File::delete($path . '/' . $filename_new);
            return Response::json(array('status' => 'error', 'data' => 'No data could be read in the file.'));
        }

        $sample = $sheet[0];

        foreach($fields as $f) {
            if (!isset($sample->$f)) {
                File::delete($path . '/' . $filename_new);
                return Response::json(array('status' => 'error', 'data' => 'Fields are missing for the selected type.'));
            }
        }


        $job = new Job;
        $job->type = 'orders';
        $job->link = $path . '/' . $filename_new;
        $job->original_name = $filename;
        $job->total = $job_total;
        $job->user()->associate(Auth::user());
        $job->save();

        return Response::json(array('status' => 'success', 'job' => $job, 'data' => 'File uploaded.'));
    }

}
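
For completeness, the idea behind JobsController@handleExecute is roughly this (a simplified sketch; the real mapping of the ~47 columns onto the order models is omitted):

class JobsController extends BaseController {

    public function handleExecute()
    {
        $jobData = Input::get('job');
        $take    = (int) Input::get('take', 100);

        $job = Job::findOrFail($jobData['id']);

        // Re-load the stored file, same call as in the import controller.
        $sheet = Excel::load($job->link, function($reader) {})->get();

        if (is_null($sheet)) {
            return Response::json(array('status' => 'error', 'data' => 'Could not re-open the stored file.'));
        }

        // Process only the next $take rows, starting from the current progress.
        $end = min($job->progress + $take, $job->total);

        for ($i = $job->progress; $i < $end; $i++) {
            $row = $sheet[$i];
            // ... map $row->order, $row->location, etc. onto the order models ...
        }

        $job->progress = $end;
        $job->save();

        return Response::json(array('status' => 'success', 'job' => $job));
    }

}

handleDelete then just cleans up the Job row and the stored file once the client reports the import as finished.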

The problem is that sometimes when the script reaches $sheet = Excel::load($path . '/' . $filename_new, function($reader) {})->get();, the server returns Error 500 (Internal server error). Sometimes it even kills my server (no kidding), stopping the php artisan serve command in the terminal. Some screenshots:

Safari

Safari 1 https://i.stack.imgur.com/3gkpr.png

Safari 2 https://i.stack.imgur.com/m8V4s.png

Firefox

Firefox 1 https://i.stack.imgur.com/QVceP.png

Firefox 2 https://i.stack.imgur.com/IcOQF.png

Firefox 3 https://i.stack.imgur.com/xvFoN.png

Firefox 4 https://i.stack.imgur.com/flx3K.png

Well, this is the problem. The server returns nothing, just an error, no description.
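
One thing I can do to get at least some description is to wrap the heavy call and log the failure. This only helps for catchable exceptions, though; fatal errors (timeout, out of memory) will still only show up in app/storage/logs/laravel.log or in the PHP error log:

try {
    $sheet = Excel::load($path . '/' . $filename_new, function($reader) {})->get();
} catch (Exception $e) {
    // Catchable failures now end up in the log and in the JSON response
    // instead of a bare 500 with no body.
    Log::error('Excel import failed: ' . $e->getMessage());
    return Response::json(array('status' => 'error', 'data' => $e->getMessage()));
}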

EDIT:

As mentioned by @lukasgeiter, I checked the log file. When the code is Excel::filter('chunk')->load($path . '/' . $filename_new)->chunk(50, function($results) { /// });, the output is as follows:

[2015-01-28 20:00:02] production.ERROR: exception 'Symfony\Component\Debug\Exception\FatalErrorException' with message 'Maximum execution time of 60 seconds exceeded' in /Users/antonsinyakin/Documents/projects/sites/foodpanda/vendor/phpoffice/phpexcel/Classes/PHPExcel/Reader/Excel2007.php:834
Stack trace:
#0 [internal function]: Illuminate\Exception\Handler->handleShutdown()
#1 {main} [] []

If a regular $sheet = Excel::load($path . '/' . $filename_new, function($reader) {})->get(); is used, nothing is written to the log file.
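
Based on that trace, the default 60-second execution time is what's being hit. Raising the limits just for this request would look something like this (the values are guesses and need tuning for my real files):

// Applied right before the heavy call in the controller.
set_time_limit(300);              // the trace above shows the default 60 s limit being exceeded
ini_set('memory_limit', '512M');  // extra headroom for PHPExcel's cell objects

Excel::filter('chunk')->load($path . '/' . $filename_new)->chunk(50, function($results) {
    foreach ($results as $row) {
        // ... persist $row here ...
    }
});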

  • Some additional info: I tried using `Excel::filter('chunk')->load($path . '/' . $filename_new)->chunk(50, function($results) { /// });` and this is **exactly** the case, when `php artisan serve` is getting killed in the terminal. No description, just `connection lost` error. – Anton Egorov Jan 28 '15 at 10:41
  • Open `app/storage/logs/laravel.log` and see if you find a more descriptive error message. – lukasgeiter Jan 28 '15 at 11:52
  • @lukasgeiter Thanks, I updated the question with the log file output. – Anton Egorov Jan 28 '15 at 12:06
  • Try reducing chunk size even further or increase application execution timeout? – Jure Špik Jun 15 '15 at 11:27
