
I am running my Django app in a load-balanced Elastic Beanstalk environment. I want to add a Celery daemon process to do the following things:

  • Upload files to S3 in the background and send a success response to my Android app
  • Send SMS messages to users to notify them about their upcoming EMIs (using Celery Beat)
  • Run the Google Cloud Vision calls that some features need; each call takes about 10 seconds, so it should run in the background (a rough sketch of these tasks follows this list)
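
Roughly, I picture the tasks looking something like this (the function names and the S3/SMS/Vision details are placeholders, not my actual code):

```python
# tasks.py -- a rough sketch, not actual project code; bucket names and the
# SMS/Vision details are placeholders.
import boto3
from celery import shared_task


@shared_task
def upload_to_s3(local_path, bucket, key):
    """Upload a file to S3 in the background so the view can respond right away."""
    s3 = boto3.client("s3")
    s3.upload_file(local_path, bucket, key)


@shared_task
def send_emi_reminders():
    """Periodic task (driven by celery beat) that texts users about upcoming EMIs."""
    # Query the RDS-backed table for EMIs due today and send one SMS per user.
    ...


@shared_task
def analyze_image(image_path):
    """Run the ~10 s Google Cloud Vision call outside the request/response cycle."""
    ...
```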

Now, I want to know: is it the right approach to deploy Celery on the same servers that Django runs on, using Amazon SQS as the broker? If yes, how do I set that up?

Also, can multiple servers on Elastic Beanstalk cause duplicate tasks because of Celery Beat?

1 Answer


It doesn't matter whether you start Celery on the same server or on a separate one; both ways are fine. What does matter is what you use for the Celery backend. If all the Celery instances share the same Redis or database backend, there is no chance that tasks will be duplicated; but if every Celery instance has its own backend, it will be chaos and disaster.
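
For example, with SQS as the broker the shared setup could look roughly like this in the Django settings (this assumes the standard django-celery wiring via config_from_object and the django-celery-results package for a database-backed result store; the region, queue prefix, and backend choice are assumptions, not details from the question):

```python
# settings.py -- every Elastic Beanstalk instance ships with the same values,
# so all workers consume from the same SQS queues and report to the same backend.
CELERY_BROKER_URL = "sqs://"            # AWS credentials come from the instance IAM role
CELERY_BROKER_TRANSPORT_OPTIONS = {
    "region": "us-east-1",              # assumed region
    "queue_name_prefix": "myapp-",      # assumed prefix to keep this app's queues separate
}
CELERY_RESULT_BACKEND = "django-db"     # shared backend on the RDS database (django-celery-results)
```

With that in place, the same configuration works whether the workers run on the web instances themselves or on separate machines; the important part is that they all point at the same broker and backend.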

vZ10
  • thanks for the answer. My concern is celery beat, which will be designed to check the database (an AWS RDS instance) for entries that belong to that date and send notifications based on them. I feel that if multiple instances of celery beat are running, it can cause duplicate notifications. I am using AWS SQS for message queuing. – Joy Lal Chattaraj Jun 24 '17 at 09:49
  • It will work fine with SQS and won't duplicate messages, as long as you don't forget to delete messages after they are executed and give the messages an appropriate "repeat time" (I don't know what it is called in AWS exactly). – vZ10 Jun 24 '17 at 11:48
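
In Celery-on-SQS terms, "deleting the message after execute" is the worker acknowledging the message, and the "repeat time" the comment refers to is SQS's visibility timeout. A rough sketch, continuing the settings above (the timeout value is an assumption):

```python
# settings.py (continued) -- acknowledge late so the SQS message is deleted only
# after the task actually finishes, and give SQS a visibility timeout longer than
# the slowest task so unfinished messages are not redelivered too early.
CELERY_TASK_ACKS_LATE = True
CELERY_BROKER_TRANSPORT_OPTIONS = {
    "region": "us-east-1",
    "queue_name_prefix": "myapp-",
    "visibility_timeout": 3600,         # seconds; assumed value, tune to the longest task
}
```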