
I have a DjangoREST web app that has to queue long-running tasks from time to time. I am trying to figure out the best approach for partitioning responsibilities.

At the moment, I have three Docker containers, not counting the MySQL DB that backs the web app: (1) the Django REST web app, (2) a Redis server, and (3) an RQ worker.
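Roughly, the layout looks something like this (simplified; service names and images are just illustrative, not my exact compose file):

```yaml
# Sketch of the container layout described above
version: "3"
services:
  web:
    build: ./webapp        # Django REST app
    ports:
      - "8000:8000"
    depends_on:
      - redis
      - db
  redis:
    image: redis:5
  worker:
    build: ./worker        # runs `rq worker` against the redis service
    depends_on:
      - redis
  db:
    image: mysql:5.7
```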

My plan was to have the web app queue a task for the RQ worker, but RQ's API requires that I supply the actual function object as a parameter to `.enqueue()`, not just its name. This gave me pause, since the function lives in the code base of the RQ worker container. Can I place tasks on the queue by writing directly to the Redis DB, without going through the RQ API? Or do I need a listener in the RQ worker container that accepts connections from the web app and acts as a proxy for `.enqueue()`, so the web app can tell the worker container what to enqueue?
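For context, the standard pattern from the RQ docs looks like the sketch below, which is what made me wonder where the function has to live (the module and function names here are just placeholders):

```python
# In the web app: the usual RQ pattern imports the task function directly
from redis import Redis
from rq import Queue

from myapp.tasks import long_running_task  # placeholder; this code lives in the worker container

q = Queue(connection=Redis(host="redis"))
job = q.enqueue(long_running_task, some_arg=42)  # enqueue() is given the function object itself
```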

If I were to just queue a function from my web app and RQ tells the worker to run it, does that function have to run in the same container as my web app? If I duplicate the function's code so it's present in both containers, will this just work? Thanks for any guidance here.
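In other words, would something like the following work, where the same task module is copied into (or shared between) both images so the worker can import the same module path that the web app enqueued? (The file layout is just my guess at what would be needed.)

```python
# tasks.py -- present in BOTH the web-app image and the worker image,
# under the same importable module path
import time

def long_running_task(n):
    time.sleep(n)          # stand-in for the real long-running work
    return f"slept {n}s"
```

```python
# In the web app: enqueue by reference; RQ stores the function's module path
# and arguments in Redis, and the worker imports and runs it.
from redis import Redis
from rq import Queue
from tasks import long_running_task

q = Queue(connection=Redis(host="redis"))
q.enqueue(long_running_task, 5)

# In the worker container, nothing special beyond `rq worker` should be needed,
# as long as `tasks` is importable there too.
```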

Steve L
  • Some potential solutions have been discussed [here](https://serverfault.com/questions/706736/sharing-code-base-between-docker-containers) – rvernica Mar 21 '19 at 17:59
  • You can actually pass functions by name to `enqueue()`, so you can say `enqueue('builtins.max', 1, 2)` instead of `enqueue(max, 1, 2)`. There is an example in the [docs](http://python-rq.org/docs/) as well. – rvernica Mar 21 '19 at 18:11
  • Thanks @rvernica -- will give that a try. – Steve L Apr 14 '19 at 03:25

0 Answers