
I recently learned that when using the multiprocessing package's Pool in Python, you need to call:

pool.close()
pool.join()

when you're finished, in order to free the memory used for state in those processes. Otherwise they persist, and your computer fills up with Python processes; they won't use the CPU, but they will hog memory.
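For context, this is roughly the pattern I mean (a minimal sketch; the worker function is just a placeholder):

from multiprocessing import Pool

def work(x):
    # stand-in for the real job
    return x * x

if __name__ == "__main__":
    pool = Pool(processes=4)
    results = pool.map(work, range(10))
    pool.close()  # no more tasks will be submitted
    pool.join()   # wait for the workers to exit and release their memory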

My question is this: I'm now using Celery for parallelization instead of Pool (I'm operating within a Django WSGI app, and Pool makes it difficult to prevent all users from forking jobs at once, which would crash the server).

I've noticed a very similar phenomenon with Celery: my six Celery processes running in the background start to gobble up memory. Is this normal, or is there an equivalent of calling close() and join() that I can use to tell a Celery task that I've retrieved the result and don't need it anymore?

Thanks a lot for the help.
Oliver


1 Answer


According to the documentation, there is no need to close the workers; they are managed automatically. You can, however, force-kill them, or discard all waiting tasks:

from celery.task.control import discard_all
discard_all()  # discards all tasks currently waiting in the queue

There are also some examples from the developers.
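If the concern is mainly memory held after results are retrieved, two other things may help (a minimal sketch, assuming a recent Celery version, an app named app, and a Redis broker/backend, all of which are illustrative): forget individual results once you've read them, and recycle worker processes after a fixed number of tasks.

from celery import Celery

app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')

# Restart each worker process after 100 tasks so accumulated memory is released
# (setting name assumes Celery 4+; older versions used CELERYD_MAX_TASKS_PER_CHILD).
app.conf.worker_max_tasks_per_child = 100

@app.task
def add(x, y):
    return x + y

result = add.delay(2, 3)
print(result.get(timeout=10))
result.forget()  # drop the stored result from the backend once it's no longer needed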
