
I have a Laravel application that sends several emails, and some of these emails have to wait some time before being sent.

So I'm using the database queue driver, and on localhost I run the command php artisan schedule:run, which executes this scheduled task:

$schedule->command('queue:work')->everyMinute();

and it works perfectly.

Now I've moved the project to cPanel shared hosting, and to run the schedule command I created a cron job that does this:

/usr/local/bin/php /path to project/artisan schedule:run

Since I need to be constantly checking whether an email needs to be sent, I set the cron job to run every minute, and it works for the first 5 or 10 minutes.

Then I start receiving a 503 error from the server because I hit the process limit, probably because of the cron job executions. And right now the server will be down for 24 hours.

How can I solve this? What is the best solution for this?

Thank you

user3242861
  • Shared host probably doesn't have the correct permissions / access to run any of those commands. Stop using shared hosting. – Kaylined Sep 27 '18 at 22:10

2 Answers


I use shared hosting and had a similar issue. If your hosting service allows the PHP shell_exec() function, you could do this:

protected function schedule(Schedule $schedule)
{
    if (!strstr(shell_exec('ps xf'), 'php artisan queue:work')) {
        $schedule->command('queue:work --timeout=60 --tries=1')->everyMinute();
    }
}

Your cron job looks fine. By the way, if your hosting server is down for 24 hours, you may want to consider another host, my friend.

queue:work is a long-running process. The check above ensures it's running on your server; it listens to your queue and processes the jobs. Being long-running also means that if you make changes to your production files, the worker will not pick them up. Have a look at my top -ac output:

    PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
2398733 user  20   0  466m  33m  12m S  0.0  0.1   0:03.15 /opt/alt/php72/usr/bin/php artisan queue:work --timeout=60 --tries=1
2397359 user  20   0  464m  33m  12m S  0.0  0.1   0:03.04 /usr/local/bin/php /home/user/booklet/artisan schedule:run
2398732 user  20   0  105m 1308 1136 S  0.0  0.0   0:00.00 sh -c '/opt/alt/php72/usr/bin/php' 'artisan' queue:work --timeout=60 --tries=1 >> '/home/user/booklet/storage/queue.log' 2>&1

As you can see, the worker is at the top; another process simply writes everything it does to a log file. You have to kill PID 2398733 after uploading new changes to your production server. The process will restart by itself in less than 5 minutes, because of the schedule:run cron job.
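The shell_exec('ps xf') check in the schedule() method above boils down to grepping the process list. A rough stand-alone equivalent you can run over SSH (a sketch; the exact ps flags vary slightly between systems) is:

```shell
# Look for a running queue worker in the process list, the same way
# the shell_exec('ps xf') check does inside schedule().
if ps ax | grep -v grep | grep -q 'artisan queue:work'; then
    echo "worker running"      # don't start a second one
else
    echo "worker not running"  # the scheduler should start one
fi
```

The grep -v grep step excludes the grep command itself, which would otherwise always match its own pattern.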

Update October 2019

protected function schedule(Schedule $schedule)
{
    if (!strstr(shell_exec('ps xf'), 'php artisan queue:work')) {
        $schedule->command('queue:work --timeout=60 --tries=1')->withoutOverlapping();
    }
}

The ->withoutOverlapping() method takes a mutex so the scheduler will not launch a second queue:work instance while one is already running, which keeps schedule:run from piling up worker processes and lets the artisan schedule command exit properly. (If you also want schedule:run to return without waiting for the worker, chain ->runInBackground() as well.)

Dimitri Mostrey
  • So, you are telling me to maintain the cron job and add the shell_exec() to my schedule function in kernel file? @DimitriMostrey – user3242861 Sep 28 '18 at 09:18
  • Yes. I had the same problem like you on my shared server. This workaround solved it. – Dimitri Mostrey Sep 28 '18 at 09:20
  • OK, another question about that. Imagine that I have 5 emails in the queue and the third email has to wait 3 minutes before being sent. With this code, it will send the other 4 emails first, right? @DimitriMostrey – user3242861 Sep 28 '18 at 09:23
  • Yes. `queue:work` is a long-running process. This little check ensures it's running on the server. You can't really keep it alive from an SSH session: if you stop the command (Ctrl+C) or close your connection, `queue:work` stops too. This is also the reason why you can't run it directly as a cron job; it would time out and the process would be killed. – Dimitri Mostrey Sep 28 '18 at 12:15

You can prevent this from happening with withoutOverlapping on the cron task.

By default, scheduled tasks will be run even if the previous instance of the task is still running. To prevent this, you may use the withoutOverlapping method:

$schedule->command('emails:send')->withoutOverlapping();

https://laravel.com/docs/5.7/scheduling#preventing-task-overlaps

This way, your cron will restart the queue:work task if it fails for some reason, but it won't fire up multiple instances of it.
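Putting the two answers together, the usual setup on shared hosting is a single cron entry running Laravel's scheduler every minute; the project path and PHP binary below are placeholders to adapt to your server:

```
# Single crontab entry: run Laravel's scheduler every minute.
# schedule:run then decides whether queue:work is due, and
# withoutOverlapping() refuses to start a second worker.
* * * * * /usr/local/bin/php /path-to-project/artisan schedule:run >> /dev/null 2>&1
```

The >> /dev/null 2>&1 part discards the output so cron doesn't email you on every run.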

ceejayoz