
I'm on Laravel using php artisan queue:listen to run queued jobs. One of these jobs is fairly involved and takes a long time, and so I'm getting the following error:

[Symfony\Component\Process\Exception\ProcessTimedOutException]
The process ""/usr/local/Cellar/php55/5.5.14/bin/php" artisan queue:work
--queue="QUEUE_URL" --delay=0 --memory=128 --sleep=3 --tries=0"
exceeded the timeout of 60 seconds.

I know that I could run queue:listen with an arbitrarily high timeout value, but that's not ideal, as I do want it to time out in the event that some process is actually unresponsive. I tried regularly calling set_time_limit(60) within the function called by the job, but that did not solve my problem.

I found a thread online mentioning Symfony\Component\Process\Process->setTimeout(null), but I don't know how to access that process object, or if that would even fix the issue.

Any help would be much appreciated.
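For context, the handler is structured roughly like this (class and method names are illustrative, not our real code):

```php
<?php

// Hypothetical Laravel 4.x-style queue handler; names are illustrative.
class ProcessLargeInput
{
    // Laravel 4.x queue handlers receive the job instance and the payload.
    public function fire($job, $data)
    {
        // The O(n^2) algorithm over a large input; runtime varies from
        // seconds to over an hour depending on input size.
        $this->runExpensiveAlgorithm($data['input']);

        // On completion we notify our API over HTTP (details elided),
        // then remove the job from the queue. The handler itself is what
        // exceeds the 60-second timeout.
        $job->delete();
    }
}
```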

Will Durney
  • Have you tried php artisan queue:listen --timeout=120? I don't see any need to reinvent the wheel if you just need to extend the time that your queue has to run. If you need longer than 5 minutes or so, though, you may need to post the actual method that is handling the queue jobs. – michael.schuett Sep 16 '14 at 20:34
  • Like I said, queue:listen --timeout={number} works, but the particular task I'm running could take anywhere from a few seconds to an hour or more, and I don't want to put in a ridiculously high timeout value. – Will Durney Sep 16 '14 at 21:37
  • What causes the variance? This is an issue with how your application is structured. To help with this issue, we need to see the code so we can better optimize for cases when you have to parse lots of data. This needs to be split into more jobs. – michael.schuett Sep 16 '14 at 23:21
  • 1
    The particular job causing problems is an O(n^2) algorithm running on a large input. We impose a hard limit on input size to keep it reasonable, but the truth is that it's just a process that can take a very long time. We're using a job queue to process it in the background, and when it's done, it makes an http request to our api to indicate it's been completed. Is there a better way to do something like this? Split the processing up into many jobs? That seems overly-complicated as it's a single algorithm that needs to be run on the data. I don't understand why set_time_limit doesn't work. – Will Durney Sep 17 '14 at 04:37
  • I'd rewrite it in another language as a micro-service: have it expose an HTTP API, post the data to it over HTTP, and it can notify your existing application when it's complete. Golang or Node sound more appropriate; PHP is really not made for that kind of heavy lifting. – AndrewMcLagan Dec 04 '16 at 22:33

4 Answers


Adding --timeout=0 worked for my set up.

UPDATE: The entire command would therefore be php artisan queue:listen --timeout=0.

David Lemayian
  • Side note for other users: it can be dangerous to set the timeout to 0, as it can result in an "invisible" infinite loop. Maybe use something "reasonable" like 120. – rap-2-h Sep 09 '15 at 15:08
  • Is there any other way? @rap-2-h's point is valid, but my problem is that my script is doing very extensive work that can run for more than 24 hours. Any solution for this? – Rizh Oct 22 '16 at 09:55
  • 2
    @Rizh Then I'd say there's something wrong with your design. You should never rely on processes that need to run for so long before they do something. Break it down into smaller components. – ankush981 Jan 13 '17 at 18:15
  • Thanks, this is what I needed to run xdebug normally. Somehow, the script always crashed for me... – Mārtiņš Briedis Mar 02 '17 at 16:36
  • 1
    I've been searching the source code and Laravel documentation and can't see anything that states that a `0` value on the timeout is infinite. We actually had an issue following the idea that it does and had scripts timing out on the default timeout and not getting to infinite. Can anyone point me to where in the source code or documentation it supports this theory? https://laravel.com/docs/8.x/queues#timeout – Dean Whitehouse Apr 15 '21 at 13:36

After investing much time, I found the solution.

Add the line below to your job class, and your job will run without timing out, even if you trigger the job from a crontab entry:

public $timeout = 0;
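In context, a minimal sketch of such a job class (the class name and body are illustrative; only the $timeout property is from this answer):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

// Illustrative job class showing where the property goes.
class ProcessLongTask implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    // 0 disables the per-job timeout, so the job can run indefinitely.
    public $timeout = 0;

    public function handle()
    {
        // Long-running work goes here.
    }
}
```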
Anmol Mourya

This is a known bug in Laravel v5.3.

You should upgrade to v5.5 to fix this problem.
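Assuming a Composer-managed app, the upgrade would look roughly like this (the version constraint is illustrative; check the official upgrade guide for breaking changes first):

```shell
# Bump the framework constraint and update it along with its dependencies.
composer require laravel/framework:5.5.* --update-with-dependencies
```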

Another way is to hack the source code, as explained here

Mostafa Lavaei
    "Hacking the source code" is never a good idea, but as this question is more than three years old and has an accepted answer, where's your point? – Nico Haase Jan 01 '18 at 09:51
  • Yes! Hacking the source code is never a good idea, but it's better than --timeout=0 – Mostafa Lavaei Jan 03 '18 at 04:30
  • Why should hacking other peoples source code be a better option than setting an option during call time that was created just for that purpose? – Nico Haase Jan 03 '18 at 08:09
  • 2
    Because --timeout=0 may causes an infinity process. This is not really hacking, it just an upgrade that has been done in v5.5 – Mostafa Lavaei Jan 08 '18 at 10:55

Queues are mainly used for requests that take a long time to complete, like sending mail in bulk; queue jobs run in the background, which also improves our web app's performance. If we don't set a timeout, it defaults to 60 seconds, and if we set the timeout to 0, the timeout is infinite: the job keeps running until it completes. To set the timeout, run this command: php artisan queue:listen --timeout=0

Here is the reference link to the official documentation: https://laravel.com/docs/8.x/queues