
I am trying to upload a file to S3 using a queued job, but I keep getting a MaxAttemptsExceededException. I have tried increasing the number of retries on the job, but it doesn't seem to help. It works fine with the sync driver, but once I switched to the Redis queue it started throwing this error. Has anyone else encountered this issue and found a solution?

The file is small; the upload takes only milliseconds when I run it without a queue.
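For reference, here is a minimal sketch of the kind of job I mean (the class name, paths and disk name are simplified placeholders, not my exact code):

    <?php

    namespace App\Jobs;

    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\SerializesModels;
    use Illuminate\Support\Facades\Log;
    use Illuminate\Support\Facades\Storage;
    use Throwable;

    class UploadFileToS3 implements ShouldQueue
    {
        use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

        public $tries = 3;

        public function __construct(private string $localPath, private string $s3Path)
        {
        }

        public function handle(): void
        {
            // Read the local file and write it to the s3 disk.
            Storage::disk('s3')->put($this->s3Path, file_get_contents($this->localPath));
        }

        public function failed(Throwable $e): void
        {
            // Log whatever exception ultimately marked the job as failed,
            // so the real cause is visible alongside MaxAttemptsExceededException.
            Log::error('S3 upload job failed: ' . $e->getMessage());
        }
    }

Dispatching it is just a plain UploadFileToS3::dispatch($localPath, $s3Path) call.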

Here is my Redis queue connection from config/queue.php:


        'redis' => [
            'driver' => 'redis',
            'connection' => 'default',
            'queue' => env('REDIS_QUEUE', 'default'),
            'retry_after' => 90,
            'block_for' => null,
            'after_commit' => false,
        ],
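
One thing I noticed while reading up on this (it comes from the Laravel queue docs, not from my setup): retry_after should always be several seconds longer than the worker's --timeout, otherwise a job can be released and picked up again while the previous attempt is still running, and the attempts counter eventually trips MaxAttemptsExceededException. A sketch of what I mean, with assumed numbers (120/90) rather than my real ones:

        'redis' => [
            'driver' => 'redis',
            'connection' => 'default',
            'queue' => env('REDIS_QUEUE', 'default'),
            // keep retry_after comfortably above the worker timeout,
            // e.g. worker started with: php artisan queue:work redis --timeout=90 --tries=3
            'retry_after' => 120,
            'block_for' => null,
            'after_commit' => false,
        ],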
• Check the log; there should be an error logged explaining why your job failed and was sent back for retry, or it will be listed in the failed_jobs table. Also, have you installed Redis and the necessary package for it? – Anuj Shrestha Jan 04 '23 at 11:17

0 Answers