
I'm looking for a straightforward way to run Celery on an Elastic Beanstalk environment. Does this exist, or do I need to use SQS instead?

I have tried putting a line in the .config file without good results. This is my .config file:

container_commands:
  01_syncdb:
    command: "django-admin.py syncdb --noinput"
    leader_only: true
  02_collectstatic:
    command: "./manage.py collectstatic --noinput"
  03_migrate:
    command: "./manage.py migrate --noinput"
  04_start_celery:
    command: "./manage.py celery worker &"

When I SSH into the EC2 instance and run ps -ef | grep celery, it shows that Celery isn't running.

Any help appreciated. Thanks!

Krishan Gupta

1 Answer


Celery doesn't show up because container commands run before the web server is restarted during deployment. In effect, your Celery workers get wiped out when the machine restarts.

I would suggest starting Celery from a post-deployment hook instead.

See http://junkheap.net/blog/2013/05/20/elastic-beanstalk-post-deployment-scripts/ and How do you run a worker with AWS Elastic Beanstalk?
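For example, here is a minimal sketch of that approach. The hook directory and the /opt/python paths match the classic Amazon Linux Python platform described in the linked post; the script name, pidfile location, and the wsgi user are assumptions you may need to adjust for your environment:

files:
  "/opt/elasticbeanstalk/hooks/appdeploy/post/99_start_celery.sh":
    mode: "000755"
    owner: root
    group: root
    content: |
      #!/usr/bin/env bash
      # Runs after each deployment, once the new app version is live.
      source /opt/python/run/venv/bin/activate
      cd /opt/python/current/app
      # Stop any worker left over from the previous version, then start fresh.
      kill $(cat /tmp/celery.pid) 2>/dev/null
      su -s /bin/bash -c "python manage.py celery worker --detach \
        --pidfile=/tmp/celery.pid --logfile=/tmp/celery.log" wsgi

Because scripts in that directory run after the new application version is in place, the worker survives the restart that wipes out anything started from container_commands.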

Victor J Wang
  • Actually, I was able to get this working on Elastic Beanstalk in a much cleaner way using https://github.com/rfk/django-supervisor to run celery workers as daemons that persist through a deploy. I did have to add some post-deployment hooks to ensure a fresh restart of this service though. – Victor J Wang Feb 19 '15 at 17:19
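As a rough illustration of the setup the comment describes (not taken from the comment itself; the program name and flags are assumptions, and the {{ PYTHON }} / {{ PROJECT_DIR }} template variables come from django-supervisor's templated config), a supervisord.conf in the project root might look like:

[program:celeryworker]
command={{ PYTHON }} {{ PROJECT_DIR }}/manage.py celery worker --loglevel=info
autostart=true
autorestart=true

Everything would then be launched with python manage.py supervisor, so the worker is restarted alongside the web process after each deploy.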