I have been having a lot of trouble getting the Laravel Excel package to export a large amount of data. I need to export about 80-100k rows, so I implemented the queued export as described in the docs. It works fine when I export a smaller number of rows, but when I try 60-80k rows, it fails every time.
While the jobs are being processed, I watch the temp file that is created, and I can see its size increasing. I also watch the jobs in the database (I'm using the database queue driver), and I can see the jobs completing for a while. The jobs seem to take incrementally more time until one finally fails. I don't get why the first several jobs are quick, and then they start taking longer and longer to complete.
I'm using supervisor to manage the queue, so here's my config for that:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/site/artisan queue:work --sleep=3 --tries=3 --timeout=120 --queue=exports,default
autostart=true
autorestart=true
user=www-data
numprocs=8
redirect_stderr=true
stdout_logfile=/var/log/supervisor/worker.log
loglevel=debug
And here is the controller code that creates the export:
(new NewExport($client, $year))
    ->queue('public/exports/' . $name)
    ->allOnQueue('exports')
    ->chain([
        new NotifyUserOfCompletedExport($request->user(), $name),
    ]);
I'm using Laravel 5.8, PHP 7.2, and PostgreSQL 10.10.
I should also mention that I have played around with the chunk size a bit, but I always end up running into the same problem. I tried chunk sizes of 500, 2000, and 10000, with no luck.
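For reference, this is where I've been changing the chunk size (in config/excel.php -- as far as I understand, this is the setting queued exports use to split the work into separate jobs):
// config/excel.php -- each chunk of this size becomes its own queued job, as I understand it
'exports' => [
    'chunk_size' => 2000, // I've tried 500, 2000, and 10000 here
],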
In the failed_jobs table, the exception is MaxAttemptsExceededException, although I have also gotten InvalidArgumentException: File '/path/to/temp/file' does not exist. I'm not quite sure what else to do. I guess I could make it so it doesn't time out, but that seems like it would just cause more problems. Any help would be appreciated.
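One thing I'm not sure about is how the worker's --timeout=120 interacts with retry_after for the database queue connection in config/queue.php. I haven't changed that from the default, so mine should look roughly like this:
// config/queue.php -- still the stock database connection as far as I know.
// The docs say --timeout should be several seconds shorter than retry_after,
// otherwise a still-running job can be picked up by another worker and
// eventually hit MaxAttemptsExceededException.
'connections' => [
    'database' => [
        'driver' => 'database',
        'table' => 'jobs',
        'queue' => 'default',
        'retry_after' => 90,
    ],
],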
EDIT
Here is the content of my Export Class:
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithHeadings;
use Maatwebsite\Excel\Concerns\WithMapping;
use Maatwebsite\Excel\Concerns\WithStrictNullComparison;

class NewExport implements FromQuery, WithHeadings, WithMapping, WithStrictNullComparison
{
    use Exportable;

    public $client;
    public $year;

    public function __construct($client, $year)
    {
        $this->client = $client;
        $this->year = $year;
    }

    public function query()
    {
        return $this->getDataQuery();
    }

    public function headings(): array
    {
        $columns = [
            //....
        ];

        return $columns;
    }

    public function map($row): array
    {
        $mapping = [];

        foreach ($row as $key => $value) {
            // Convert booleans to Yes/No, pass everything else through as-is
            if (is_bool($value)) {
                $mapping[$key] = $value ? "Yes" : "No";
            } else {
                $mapping[$key] = $value;
            }
        }

        return $mapping;
    }

    private function getDataQuery()
    {
        return \DB::table('my_table')->orderBy('my_field');
    }
}
The NotifyUserOfCompletedExport class is just a queued job that emails the logged-in user a link to download the finished export:
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\SerializesModels;

class NotifyUserOfCompletedExport implements ShouldQueue
{
    use Queueable, SerializesModels;

    public $user;
    public $filename;

    public function __construct(User $user, $filename)
    {
        $this->user = $user;
        $this->filename = $filename;
    }

    public function handle()
    {
        // This just sends the email
        $this->user->notify(new ExportReady($this->filename, $this->user));
    }
}
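The ExportReady notification itself is nothing special -- simplified, it looks roughly like this (the real one has more text and slightly different link generation):
use Illuminate\Bus\Queueable;
use Illuminate\Notifications\Messages\MailMessage;
use Illuminate\Notifications\Notification;

class ExportReady extends Notification
{
    use Queueable;

    public $filename;
    public $user;

    public function __construct($filename, $user)
    {
        $this->filename = $filename;
        $this->user = $user;
    }

    public function via($notifiable)
    {
        return ['mail'];
    }

    public function toMail($notifiable)
    {
        return (new MailMessage)
            ->subject('Your export is ready')
            ->line('Your export has finished processing.')
            // Link generation simplified; the export is stored under public/exports
            ->action('Download export', url('storage/exports/' . $this->filename));
    }
}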
EDIT 2:
So I read this post, and I verified that eventually my server was just running out of memory. That led to the MaxAttemptsExceededException error. I added more memory to the server, but I am still getting the InvalidArgumentException: File '/path/to/temp/file' does not exist after the jobs have completed. It's even weirder, though, because I can see that /path/to/temp/file actually does exist. So I have no idea what is going on here, but it's super frustrating.
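One thing I did notice while digging around is the temporary_files section in config/excel.php, which I haven't touched. On my install it looks roughly like this (exact keys may vary by package version); as far as I can tell, this is where the temp file from the error lives, and remote_disk is only meant for workers running on different servers, which shouldn't apply here since all 8 worker processes are on the same box:
// config/excel.php -- untouched defaults on my install
'temporary_files' => [
    'local_path' => storage_path('framework/laravel-excel'),
    'remote_disk' => null,
],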