I have a Django REST web app that occasionally needs to queue long-running tasks, and I'm trying to figure out the best way to partition responsibilities between containers.
At the moment I have three Docker containers serving the web app, not counting the MySQL database: (1) the Django REST web app, (2) a Redis server, and (3) an RQ worker.
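For context, here's a minimal sketch of what lives in the worker container; the module and function names are placeholders, not my real code:

```python
# tasks.py -- lives only in the RQ worker container's code base (sketch)
import time

def long_running_task(report_id):
    """Stand-in for my real long-running job."""
    time.sleep(60)  # pretend this is expensive work
    return f"report {report_id} done"
```

The worker itself is started with the stock `rq worker` command, pointed at the Redis container (e.g. `rq worker --url redis://redis:6379 default`).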
My plan was to have the web app queue a task for the RQ worker, but every RQ example I've seen passes the actual function object to `.enqueue()`, not just its name. This made me pause, since the function lives in the code base of the RQ worker container, not the web app's. Can I place tasks on the queue by writing directly to the Redis DB, without going through the RQ API? Or do I need a listener in the RQ worker container that accepts connections from the web app and acts as a proxy for `.enqueue()`, so the web app can tell the worker container what to enqueue?
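To make the sticking point concrete, here's roughly what I'd want to write in the web app; the hostname and names are assumed from the sketch above:

```python
# views.py in the web-app container (sketch)
from redis import Redis
from rq import Queue

# "redis" is the Redis container's hostname on the Docker network
q = Queue(connection=Redis(host="redis", port=6379))

def kick_off(report_id):
    # This is where I get stuck: .enqueue() wants the function object,
    # but tasks.py only exists in the worker container, so this import
    # fails in the web-app image:
    from tasks import long_running_task
    return q.enqueue(long_running_task, report_id)
```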
If I do enqueue a function from my web app and RQ hands it to the worker, does that function have to run in the same container as the web app? If I duplicate the function's code so it's present in both containers, will that just work? Thanks for any guidance here.
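In other words, is the duplication approach below sound? My (possibly wrong) understanding is that `.enqueue()` stores the function's import path and arguments in Redis, so the worker would resolve the same dotted path against its own copy of the module:

```python
# web-app container, now with tasks.py copied into this image too (sketch)
from redis import Redis
from rq import Queue
from tasks import long_running_task  # duplicate of the worker's tasks.py

q = Queue(connection=Redis(host="redis", port=6379))

# If enqueue only records "tasks.long_running_task" plus the args,
# the worker should import and run its own copy of the function.
job = q.enqueue(long_running_task, 42)
```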