RQ (Redis Queue) is a simple, lightweight Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry, so it can be integrated into your web stack easily.
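A minimal sketch of the typical workflow, assuming a local Redis instance and an illustrative tasks.py holding the job function:

# tasks.py
def count_words(text):
    return len(text.split())

# app.py
from redis import Redis
from rq import Queue
from tasks import count_words

q = Queue(connection=Redis())
job = q.enqueue(count_words, 'a few words here')
# a worker started with `rq worker` picks the job up and stores the result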
I have a Flask app with Sentry error tracking. I have now created some tasks with rq, but their errors do not show up in the Sentry Issues stream. I can tell the issues aren't being filtered out, because the number of filtered issues doesn't increase. The errors…
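A likely cause is that the Sentry SDK is initialized in the Flask process but not in the worker process. A minimal sketch of enabling Sentry's RQ integration in code the worker loads (the DSN is a placeholder):

import sentry_sdk
from sentry_sdk.integrations.rq import RqIntegration

sentry_sdk.init(
    dsn='https://<key>@<org>.ingest.sentry.io/<project>',  # placeholder
    integrations=[RqIntegration()],
)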
I currently have multiple python-rq workers executing jobs from a queue in parallel. Each job also uses the Python multiprocessing module.
Job execution code is simply this:
from redis import Redis
from rq import Queue
q = Queue('calculate',…
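For context, a job that itself fans out with multiprocessing might look like this sketch (calculate_chunk is a hypothetical helper, not from the original code):

from multiprocessing import Pool

def calculate_chunk(n):        # hypothetical per-chunk worker
    return n * n

def calculate(data):           # the function enqueued on the 'calculate' queue
    with Pool(processes=4) as pool:
        return pool.map(calculate_chunk, data)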
I am writing an app that uses RQ for jobs. The Redis instance for RQ is in Heroku, but the RQ workers are on external machines. They make a Redis connection when they start.
If Heroku moves the Redis instance (which it can and does), then the…
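One common mitigation, sketched below, is to rebuild the connection from the REDIS_URL config variable at startup and run the worker under a supervisor, so that when Redis moves the worker dies on the stale connection and is restarted with the fresh URL:

import os
from redis import Redis
from rq import Queue, Worker

conn = Redis.from_url(os.environ['REDIS_URL'])   # re-read on every restart
worker = Worker([Queue(connection=conn)], connection=conn)
worker.work()   # a lost connection ends this; the supervisor restarts us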
I'm enqueueing several jobs to an rq.Queue and want to wait (not busy-wait) until the queue is empty and all jobs have been handled. Is there such an event to listen to?
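As far as I know RQ has no built-in "queue drained" event, so the usual workaround is to poll with a sleep, checking both the queue and its started-job registry; a sketch:

import time
from redis import Redis
from rq import Queue

def wait_for_empty(queue, poll_interval=1.0):
    # done when nothing is waiting and nothing is currently executing
    while len(queue) > 0 or queue.started_job_registry.count > 0:
        time.sleep(poll_interval)

wait_for_empty(Queue(connection=Redis()))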
I have a DjangoREST web app that has to queue long-running tasks from time to time. I am trying to figure out the best approach for partitioning responsibilities.
At the moment, I have 3 Docker "containers", not counting a MySQL DB that is serving…
I develop locally on Windows 10, which is a problem for using the RQ task queue, since it only works on Linux systems because it requires the ability to fork processes. I'm trying to extend the flask-base project…
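A workaround often suggested for local Windows development is SimpleWorker, which executes jobs in the main process instead of forking (at the cost of crash isolation; depending on your RQ version you may still need WSL); a sketch:

from redis import Redis
from rq import Queue, SimpleWorker

conn = Redis()
SimpleWorker([Queue(connection=conn)], connection=conn).work()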
In a similar vein to this question, is there any way to submit a function defined in the same file to python-rq? (This is for @GG_Python, who asked me to create a new question for it.)
Usage example:
# somemodule.py
from redis import Redis
from rq import…
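One workaround is to enqueue the function by its dotted import path rather than by reference, so the worker resolves it with a normal import even when the defining script ran as __main__; a sketch (count_words is illustrative):

# somemodule.py
from redis import Redis
from rq import Queue

def count_words(text):
    return len(text.split())

if __name__ == '__main__':
    q = Queue(connection=Redis())
    # a string path avoids enqueueing a __main__-qualified reference
    q.enqueue('somemodule.count_words', 'some text here')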
I don't understand python-rq very well yet; I've just started learning about it.
There is a task_a that takes 3 minutes to finish processing.
@job
def task_a():
    time.sleep(180)
    print('done processing task_a')

def call_3_times():
    …
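For reference, with the decorator fully specified, calling the task three times would look like this sketch; one worker processes the three jobs sequentially (roughly nine minutes total), while three workers run them in parallel:

import time
from redis import Redis
from rq.decorators import job

@job('default', connection=Redis())
def task_a():
    time.sleep(180)
    print('done processing task_a')

def call_3_times():
    for _ in range(3):
        task_a.delay()   # enqueues a job; worker count decides concurrency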
Currently the rq logs look something like this:
15:15:03
15:15:03 *** Listening on ingest...
15:17:41 ingest: tasks.ingest_job(aws_key='SDLKASFJHJKAHAJ', aws_secret='LDFKASJKDFJKSAHKJHkjsadhfkjaKJDSAHK')
So it logs the arguments of the job…
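One way to keep credentials out of that log line is to override the job's description, which is what the worker prints instead of the rendered call string; a sketch (the values shown are placeholders):

from redis import Redis
from rq import Queue
import tasks

q = Queue('ingest', connection=Redis())
q.enqueue(tasks.ingest_job,
          aws_key='<key>', aws_secret='<secret>',
          description='tasks.ingest_job(<redacted>)')  # logged instead of args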
I'm facing a basic issue while setting up python-rq - the rqworker doesn't seem to recognize jobs that are pushed to the queue it's listening on.
Everything is run inside virtualenv
I have the following code:
from redis import Redis
from rq…
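Two frequent causes are a queue-name mismatch between the enqueuing code and the worker, and a worker started from a directory where the task module isn't importable; a sketch of a matching pair (tasks.py and the queue name are illustrative):

# tasks.py
def add(x, y):
    return x + y

# enqueue.py -- run `rq worker myqueue` from this same directory
from redis import Redis
from rq import Queue
from tasks import add

q = Queue('myqueue', connection=Redis())  # name must match the worker's
q.enqueue(add, 2, 3)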
I'm trying to test a queued redis job, but the metadata doesn't seem to be passed between the task and the originator. The job IDs appear to match, so I'm perplexed. Maybe some fresh eyes can help me work out the problem:
The task is as per the…
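For comparison, the meta round-trip requires save_meta() inside the task and refresh() on the originator's side before reading; a sketch (long_task is illustrative):

# tasks.py
from rq import get_current_job

def long_task():
    job = get_current_job()
    job.meta['progress'] = 42
    job.save_meta()        # persist meta back to Redis

# originator
from redis import Redis
from rq import Queue
from tasks import long_task

job = Queue(connection=Redis()).enqueue(long_task)
# ... later ...
job.refresh()              # re-read meta (and status) from Redis
print(job.meta.get('progress'))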
Question: Why is Redis filling up if the results of jobs are discarded immediately?
I'm using Redis as a queue to create PDFs asynchronously and then save the result to my database. Since it's saved, I don't need to access the object at a later date and…
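If the return value really is never read back, one lever is result_ttl, which controls how long RQ keeps a job's result in Redis; a sketch (create_pdf is a hypothetical job that writes the PDF to the database itself):

from redis import Redis
from rq import Queue
from tasks import create_pdf   # hypothetical

q = Queue(connection=Redis())
# result_ttl=0 discards the return value as soon as the job finishes
q.enqueue(create_pdf, document_id=123, result_ttl=0)

Note that failed jobs are kept in a separate registry and can also accumulate.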
What is a good way to reduce the number of workers on a machine in Python-RQ?
According to the documentation, I need to send a SIGINT or SIGTERM command to one of the worker processes on the machine:
Taking down workers
If, at any time, the worker…
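A sketch of doing that from Python on the machine itself: enumerate the registered workers and send one of them SIGINT for a warm shutdown (it finishes its current job and then exits). os.kill targets a local PID, so this assumes it runs on the same host as the workers:

import os
import signal
from redis import Redis
from rq import Worker

workers = Worker.all(connection=Redis())
if workers:
    os.kill(workers[0].pid, signal.SIGINT)   # warm shutdown of one worker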
Whenever we try to view queue info using the rq info -u <> command, we get lots of extra entries like this:
a331d42408099f7e5ec9c5864 (None None): ?
c352af4c2385cdf320d7b74897 (None None): ?
134174815b44c44d706417eb0 (None…
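To see what those entries correspond to, one diagnostic sketch is to enumerate the registered workers from Python and inspect their names and states; stale registrations from workers that died without cleaning up often show up exactly like this:

from redis import Redis
from rq import Worker

for w in Worker.all(connection=Redis()):
    print(w.name, w.get_state(), [q.name for q in w.queues])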