
My project consumes several third-party APIs that enforce request rate limits. It calls these APIs through Laravel jobs, and I am using spatie/laravel-rate-limited-job-middleware for rate limiting.

Once a project is submitted, around 60 jobs are dispatched on average. These jobs need to be executed at a rate of 1 job/minute.

There is one supervisord program running 2 processes of the default queue with `--tries=3`. In `config/queue.php`, for the redis connection, I am using `'retry_after' => (60 * 15)` to avoid retrying while a job is still executing.

My current rate limiter middleware is coded this way:

```php
return (new RateLimited())
    ->allow(1)
    ->everySeconds(60)
    ->releaseAfterBackoff($this->attempts());
```
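For context, this return value comes from the job's `middleware()` method. A minimal sketch of how the job class is wired up (the class name, the `handle()` body, and the `retryUntil()` window are illustrative, not my actual code):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;
use Spatie\RateLimitedMiddleware\RateLimited;

class CallThirdPartyApi implements ShouldQueue
{
    use InteractsWithQueue, Queueable;

    // Rate limiter: let 1 job through per 60 seconds; jobs that
    // don't get the slot are released back with a backoff delay.
    public function middleware(): array
    {
        return [
            (new RateLimited())
                ->allow(1)
                ->everySeconds(60)
                ->releaseAfterBackoff($this->attempts()),
        ];
    }

    // A time-based limit instead of a fixed attempt count: keep
    // retrying until this deadline, regardless of --tries.
    public function retryUntil(): \DateTime
    {
        return now()->addHours(2); // illustrative window
    }

    public function handle(): void
    {
        // call the third-party API here
    }
}
```

(`retryUntil()` is Laravel's documented time-based alternative to a fixed `--tries` count, which seems relevant since each release by the limiter counts as an attempt.)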

What happens is that 3 jobs get processed in 3 minutes, but after that all remaining jobs fail.

What I can understand is that all jobs are re-queued every minute, and once they cross the tries threshold (3), they are moved to `failed_jobs`.

I tried removing the `--tries` flag, but that didn't work. I also tried increasing it to `--tries=20`, but then jobs fail after 20 minutes. I don't want to hardcode the `--tries` flag, as in some situations more than 100 jobs can be dispatched.

I also want to increase the number of queue worker processes in supervisor so that a few jobs can execute in parallel.
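For reference, my supervisor program looks roughly like this (program name and paths are illustrative). Bumping `numprocs` is what I mean by adding parallel workers; since the Spatie limiter is backed by Redis, my understanding is the 1-per-minute limit would still be shared across all worker processes:

```ini
[program:laravel-worker]
command=php /var/www/myproject/artisan queue:work redis --queue=default --tries=3
numprocs=2          ; increase this for parallel workers
process_name=%(program_name)s_%(process_num)02d
autostart=true
autorestart=true
stopwaitsecs=930    ; should exceed the retry_after window (60 * 15)
```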

I understand it is an issue with configuring the retry/timeout flags, but I don't understand how. Need help...

  • First of all, check the supervisor log; there may be some timeout issue on the server end. Check the Laravel and server (Apache/Nginx) logs to get an idea of why the jobs are failing – Shawinder Jit Singh Mar 18 '21 at 09:58
  • Jobs are failing due to max attempts. All failed jobs are going to the failed jobs table with `Illuminate\Queue\MaxAttemptsExceededException` – Pankaj Jha Mar 18 '21 at 10:12
  • It seems a function called inside the job is hitting an error; that's why it goes to retry, because normally it should not retry – Shawinder Jit Singh Mar 19 '21 at 04:01
  • I firmly believe this is an issue with rate limiting, as the error is `MaxAttemptsExceededException`. When I disable rate limiting, everything works fine. I have tested the jobs carefully. The only thing I don't understand is how Laravel applies rate limiting through Redis. – Pankaj Jha Mar 19 '21 at 05:21
  • Have you specified maximum tries in the supervisor configuration file? – Shawinder Jit Singh Mar 19 '21 at 07:33
  • Yes, there is one supervisord program running 2 processes of the default queue with `--tries=3` – Pankaj Jha Mar 19 '21 at 08:07
  • In my case I haven't used the rate-limited middleware; that might be where the problem is. Try once without the middleware; you might get some hint – Shawinder Jit Singh Mar 19 '21 at 08:34
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/230119/discussion-between-shawinder-jit-singh-and-pankaj-jha). – Shawinder Jit Singh Mar 19 '21 at 09:59
