I would like to implement a queue with Redis, Flask, and Python. I have already implemented a queue like this with RQ, and it works fine when the Flask app and the worker run on the same server. I am wondering whether it is possible to create a (multi-consumer) queue where the worker(s) actually run on another server. For example:
Client posts data to Flask -> Flask creates an item in the Redis queue -> the Redis queue is picked up by workers on another server (backend).
Since the code in Flask is something like:
from redis import Redis
from rq import Queue

redis_conn = Redis()
q = Queue('my_queue', connection=redis_conn)
job = q.enqueue_call(func='myqueue.myfunc', args=(json,), result_ttl=5000)
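What I am picturing on the Flask side is pointing that same connection at a Redis instance that both servers can reach; a minimal sketch, where the hostname, port, and password are just placeholders:

from redis import Redis
from rq import Queue

# Redis instance reachable from both the Flask server and the backend server
# ('redis.example.com' and the credentials are placeholders).
redis_conn = Redis(host='redis.example.com', port=6379, password='secret')
q = Queue('my_queue', connection=redis_conn)

# Enqueue the job by its dotted path, exactly as before.
job = q.enqueue_call(func='myqueue.myfunc', args=(json,), result_ttl=5000)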
Obviously 'myqueue.myfunc' needs to stay on the Flask server, but I would like to be able to just push the data and have a worker on another server. Do you know if this is feasible, or what else could be used to solve this problem?
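This is roughly what I picture running on the backend server, assuming the function lives there in a module called myqueue (the module name, file names, and Redis credentials below are placeholders):

# myqueue.py, deployed on the backend/worker server
def myfunc(payload):
    # process the data that Flask enqueued
    print('processing', payload)

# worker.py, also on the backend server
from redis import Redis
from rq import Queue, Worker

# Same Redis instance the Flask app enqueues to.
redis_conn = Redis(host='redis.example.com', port=6379, password='secret')
worker = Worker([Queue('my_queue', connection=redis_conn)], connection=redis_conn)
worker.work()  # blocks and consumes jobs from 'my_queue'

# From the shell this would be: rq worker my_queue --url redis://:secret@redis.example.com:6379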
Thanks.