Laravel version: 7.x

I have a test data file of 1161 rows, which needs to be uploaded via AJAX and rendered in an editable format when validation fails. So I am using Excel::toCollection(...). But even with this small amount of data, the application hangs for a while until the rendering is complete.

To handle this situation, I am reading only 50 rows at a time, validating them, and appending them to the form via AJAX. This has reduced the load and decreased the rendering time, but each time I have to read the entire file and extract the next chunk from it.

Here is my code:

// Pagination window sent with each AJAX request
$arrayPost = request()->only([
    'start',
    'limit',
]);

// toCollection() returns a Collection of sheets, so take the first sheet
// and slice out only the requested window of rows
$arrayExcel = \Excel::toCollection(new SubscriberImport(), <uploaded_file_path>)
    ->first()
    ->slice($arrayPost['start'], $arrayPost['limit']);

$validated = $this->validateExcel($arrayExcel);

if ($validated->fails()) {
    // Re-render the chunk as an editable form along with the validation errors
    return view('<form_path>', [
        'arrayExcel' => $arrayExcel,
    ])->withErrors($validated->getMessageBag()->toArray());
} else {
    ...
}

Is there a way to fetch specific rows from the file?
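
For what it's worth, Laravel Excel 3.1 also ships WithStartRow and WithLimit concerns, which restrict which rows an import reads in the first place. The sketch below is an untested attempt at wiring them to the start/limit request parameters; the SubscriberChunkImport class and its constructor are hypothetical, not part of the question's code:

use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithLimit;
use Maatwebsite\Excel\Concerns\WithStartRow;

// Hypothetical import that only reads a window of rows from the sheet.
class SubscriberChunkImport implements ToCollection, WithLimit, WithStartRow
{
    private $start;
    private $limit;

    public function __construct(int $start, int $limit)
    {
        $this->start = $start;
        $this->limit = $limit;
    }

    // First spreadsheet row to read (1-based, hence the +1 on a 0-based offset).
    public function startRow(): int
    {
        return $this->start + 1;
    }

    // Maximum number of rows to read from startRow onwards.
    public function limit(): int
    {
        return $this->limit;
    }

    public function collection(Collection $rows)
    {
        // $rows holds at most $this->limit rows.
    }
}

With something like this, each AJAX request could call \Excel::toCollection(new SubscriberChunkImport($arrayPost['start'], $arrayPost['limit']), <uploaded_file_path>) and receive only the requested window, although the underlying reader may still open the whole file.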

Mr.Singh
  • Have you tried chunk reading? https://docs.laravel-excel.com/3.1/imports/chunk-reading.html – Leonardo Vitali Jun 22 '20 at 12:07
  • I have tried this, but it didn't solve my problem: I need to read only 50 rows at a time, and the next set of 50 rows will be handled in the next AJAX request, and so on, until all rows have been appended to the form. – Mr.Singh Jun 22 '20 at 12:14
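
For context, the chunk reading suggested in the first comment splits a single import run into batches rather than serving one batch per request: the import declares WithChunkReading, and Laravel Excel then feeds collection() one chunk at a time. A minimal sketch, assuming the same SubscriberImport class from the question:

use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class SubscriberImport implements ToCollection, WithChunkReading
{
    public function collection(Collection $rows)
    {
        // Called once per chunk within a single import run.
    }

    // Read the spreadsheet 50 rows at a time to keep memory usage low.
    public function chunkSize(): int
    {
        return 50;
    }
}

This keeps memory down during one import, but, as the asker notes in the comment above, it does not by itself spread the chunks across separate AJAX requests.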

0 Answers