
I hosted my Django app on Heroku but, due to a few limitations, I moved from Heroku to a cloud-based server. I followed a tutorial on running background tasks in Python. Everything runs fine, except that I have to run python worker.py manually to start the worker process.

On Heroku we can use a Procfile to run processes when the app starts, but now I am on a cloud-based server running Ubuntu 14.04. So what is the alternative to a Procfile?

worker.py

import os

import redis
from rq import Worker, Queue, Connection

listen = ['high', 'default', 'low']

redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')

conn = redis.from_url(redis_url)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(map(Queue, listen))
        worker.work()
Addicted

2 Answers


I ended up using Upstart. I created a new config file with sudo nano /etc/init/rqworker.conf containing the following:

description "Job queues for directory"

start on runlevel [2345]
stop on runlevel [!2345]

respawn
setuid myuser
setgid www-data

exec python3.5 worker.py

Then I just started the service with sudo service rqworker start, and now my worker processes run in the background.
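
Note that Upstart was replaced by systemd from Ubuntu 15.04 onwards, so on newer releases a roughly equivalent unit file is needed instead. A sketch, assuming it is saved as /etc/systemd/system/rqworker.service (the WorkingDirectory path and interpreter path are illustrative and must point at wherever worker.py actually lives):

```
[Unit]
Description=Job queues for directory
After=network.target

[Service]
User=myuser
Group=www-data
# Directory containing worker.py -- adjust to your app's location
WorkingDirectory=/path/to/app
ExecStart=/usr/bin/python3.5 worker.py
# Mirror Upstart's "respawn" behaviour
Restart=always

[Install]
WantedBy=multi-user.target
```

It would then be enabled and started with sudo systemctl enable rqworker and sudo systemctl start rqworker.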

Addicted

Use a process manager like Upstart, systemd, or supervisor.
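
For instance, with supervisor a minimal program entry might look like the sketch below (file location, paths, and user names are illustrative), placed in /etc/supervisor/conf.d/rqworker.conf:

```
[program:rqworker]
; Adjust directory and command to where worker.py actually lives
command=python3.5 worker.py
directory=/path/to/app
user=myuser
autostart=true
autorestart=true
stdout_logfile=/var/log/rqworker.out.log
stderr_logfile=/var/log/rqworker.err.log
```

After adding the file, sudo supervisorctl reread followed by sudo supervisorctl update would pick it up and start the worker.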

Daniel Roseman