
I have Celery running on a server alongside my Django app. The app listens for data being posted, and once data is posted it hands that data off to a Celery task. When a lot of data is sent, the tasks end up failing, and in my Django admin I see the error:

{"exc_type": "WorkerLostError", "exc_message": ["Worker exited prematurely: signal 9 (SIGKILL) Job: 134."], "exc_module": "billiard.exceptions"}

I cannot figure out how to fix this; any help would be much appreciated.

Ali_Rashid

1 Answer


I use Redis to store the tasks and Celery to process them.

Updating Celery and Redis solved the issue for me.

I updated Celery with the command pip install -U Celery, and Redis as described in its documentation.

Also, note that I had to reinstall Redis rather than just update it, because a new major version was available. Updating does not switch to a newer major version when it differs from the current one, since major versions can introduce breaking changes. Refer to SemVer for more details.
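The reasoning above can be sketched as a small check. This is a hypothetical helper for illustration, not part of pip or Redis:

```python
def major(version: str) -> int:
    """Return the major component of a SemVer string like '7.0.4'."""
    return int(version.split(".")[0])

def plain_update_is_enough(current: str, latest: str) -> bool:
    # Under SemVer, a major-version bump signals breaking changes,
    # so moving e.g. from 6.x to 7.x warrants a clean reinstall
    # rather than an in-place update.
    return major(latest) == major(current)
```

For example, `plain_update_is_enough("6.2.6", "6.2.14")` returns True (same major version), while `plain_update_is_enough("6.2.6", "7.0.4")` returns False, matching the situation described above.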