RQ is a simple, lightweight Python library for creating background jobs and processing them.
RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry; it can be integrated into your web stack easily.
I have a nested job structure in my Python Redis queue. First the rncopy job is executed. Once this is finished, the 3 dependent registration jobs follow. When the computation of all 3 of these jobs is finished I want to trigger a job to send a…
Trying to find a good way to catch a timeout of an RQ job, so it can be requeued after the timeout.
Basically, the correct solution would provide a way (for example, an exception handler in the worker or something of the sort) to requeue the job…
I feel a bit stupid for asking, but it doesn't appear to be in the documentation for RQ. I have a 'failed' queue with thousands of items in it and I want to clear it using the Django admin interface. The admin interface lists them and allows me to…
I followed the excellent Flask Mega Tutorial by Miguel Grinberg and have successfully setup a Flask web app with a Redis task queue and RQ workers, all in Docker containers.
To improve task queue performance, I now need to use my own custom worker,…
Due to having trouble with the reliability and scheduling in Celery, we decided to evaluate alternatives. I have been struggling to set up a benchmark between the two message-queue solutions with regard to baseline performance.
My current approach is to…
I am writing a Flask app that asks the user to upload an Excel spreadsheet and then calculates and populates the database. I am trying to do the processing part in the background via Redis RQ, but I keep getting TypeError: cannot serialize '_io.TextIOWrapper'…
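The root cause is that `enqueue()` pickles every job argument before storing it in Redis, and an open file handle is not picklable. Passing the file's path or contents instead avoids the error, as this self-contained sketch shows (pure stdlib, no RQ needed to demonstrate):

```python
import pickle
import tempfile

# Create a throwaway "spreadsheet" to demonstrate with.
with tempfile.NamedTemporaryFile("w+", suffix=".csv", delete=False) as f:
    f.write("a,b\n1,2\n")
    path = f.name

# Pickling an open file handle fails -- this is the RQ error's root
# cause, since job arguments are pickled on their way into Redis.
handle = open(path)
try:
    pickle.dumps(handle)
    pickled_ok = True
except TypeError:
    pickled_ok = False
finally:
    handle.close()

# Fix: pass something picklable -- the file path, or its contents --
# and let the worker do the reading/parsing itself.
with open(path) as f:
    contents = f.read()
payload = pickle.dumps(contents)  # works fine
```

If the worker runs in a separate container, passing the contents (or storing the upload somewhere shared) is safer than passing a path.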
I now have a large number of documents to process and am using Python RQ to parallelize the task.
I would like a pipeline of work to be done as different operations are performed on each document. For example: A -> B -> C means pass the document to…
I need to start multiple workers for only one of my queues (the "high" priority one below). How can I do this in the context of a worker script that I am using to start my workers?
from config import Config
from redis import from_url
from rq import…
I have a web service (Python 3.7, Flask 1.0.2) with a workflow consisting of 3 steps:
Step 1: Submitting a remote compute job to a commercial queuing system (IBM's LSF)
Step 2: Polling every 61 seconds for the remote compute job status (61 seconds…
I want to use rq to run tasks on a separate worker to gather data from a measuring instrument. The end of the task will be signaled by a user pressing a button on a dash app.
The problem is that the task itself does not know when to terminate since…
My first question/post ... please be kind....
I am working on a personal project where one module runs in a loop gathering data. When data comes in, it hands off the insertion of the data into a database to a function on a queue where a listening rq…
I am running a Flask server which loads data into a MongoDB database. Since there is a large amount of data, and this takes a long time, I want to do this via a background job.
I am using Redis as the message broker and Python-rq to implement the…
So, RQ explicitly states I can enqueue an instance method of an object here, so I've been trying to do that, but getting a PicklingError:
q.enqueue(some_obj.some_func, some_data)
*** PicklingError: Can't pickle : attribute lookup…
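The usual cause: pickling a bound method also pickles its instance, and the instance's class must be importable at module level on both the enqueueing side and the worker. A stdlib-only demonstration (hypothetical class names):

```python
import pickle


class Processor:
    """Module-level class: instances and their bound methods pickle fine."""

    def __init__(self, factor):
        self.factor = factor

    def scale(self, x):
        return x * self.factor


# Bound method of a picklable, importable instance: works (Python 3.5+).
data = pickle.dumps(Processor(3).scale)
result = pickle.loads(data)(7)


def make_local_method():
    class Local:               # defined inside a function: NOT importable
        def m(self):
            return 1
    return Local().m


failed = False
try:
    pickle.dumps(make_local_method())  # "attribute lookup ... failed"
except (AttributeError, pickle.PicklingError):
    failed = True
```

So if `some_obj`'s class is defined inside a function, a shell session, or otherwise isn't importable by the worker, `q.enqueue(some_obj.some_func, ...)` will raise exactly this PicklingError.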
All, I'm attempting to 'force' RQ workers to perform concurrently using supervisord. My supervisord setup seems to work fine, as rq-dashboard is showing 3 workers, 3 PIDs and 3 queues (one for each worker/PID). Supervisord setup is as follows…
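For reference, a typical supervisord stanza for this pattern (a generic sketch, not the poster's elided config; assumes the `rq worker` CLI from RQ 1.0+) uses `numprocs` to fork several identical worker processes:

```ini
[program:rqworker]
; numprocs starts N independent worker processes; process_name must be
; made unique per process when numprocs > 1.
command=rq worker high default
numprocs=3
process_name=%(program_name)s-%(process_num)s
autostart=true
autorestart=true
```

Concurrency here comes from the OS running the 3 processes in parallel; each worker still handles one job at a time.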
I've started using RQ / Redis to build out some asynchronous execution of some long running jobs for my django site. I'm hoping to do something like the following:
I want one queue for each instance of a model. You can think of this model like an…