My project consumes several third-party APIs that enforce request rate limits. My project calls these APIs through Laravel jobs, and I am using spatie/laravel-rate-limited-job-middleware for rate limiting.
Once a project is submitted, around 60 jobs are dispatched on average. These jobs need to be executed at 1 job per minute.
There is one supervisord program running 2 processes of the default queue worker with --tries=3.
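For reference, the supervisor program looks roughly like this (the program name and project path are placeholders, not my actual values):

```ini
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/project/artisan queue:work redis --queue=default --tries=3
numprocs=2
autostart=true
autorestart=true
```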
Also, in config/queue.php, for the redis connection I am using 'retry_after' => (60 * 15) to avoid a job being retried while it is still executing.
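The relevant part of my redis connection config is roughly this (other keys are the Laravel defaults):

```php
// config/queue.php
'redis' => [
    'driver' => 'redis',
    'connection' => 'default',
    'queue' => env('REDIS_QUEUE', 'default'),
    // 15 minutes before the worker considers a reserved job stalled and retries it
    'retry_after' => 60 * 15,
    'block_for' => null,
],
```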
My current rate limiter middleware is configured this way:

```php
return (new RateLimited())
    ->allow(1)
    ->everySeconds(60)
    ->releaseAfterBackoff($this->attempts());
```
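For context, that snippet is returned from the job's middleware() method, roughly like this (the job class name here is a placeholder):

```php
use Spatie\RateLimitedMiddleware\RateLimited;

class CallThirdPartyApiJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function middleware(): array
    {
        return [
            (new RateLimited())
                ->allow(1)
                ->everySeconds(60)
                ->releaseAfterBackoff($this->attempts()),
        ];
    }
}
```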
What happens is that 3 jobs get processed in the first 3 minutes, but after that all remaining jobs fail.
What I understand is that rate-limited jobs are released back onto the queue every minute, and once a job crosses the tries threshold (3), it is moved to failed_jobs.
I tried removing the --tries flag, but that didn't work. I also tried increasing it to --tries=20, but then jobs fail after 20 minutes.
I don't want to hardcode the --tries flag, since in some situations more than 100 jobs can be dispatched.
I also want to increase the number of queue worker processes in supervisor so that a few jobs can execute in parallel.
I understand this is an issue with how the retry and timeout settings are configured, but I don't understand how to fix it. Need help...