Questions tagged [python-rq]

RQ is a simple, lightweight Python library for creating background jobs and processing them.

RQ (Redis Queue) is a simple library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry, so it can be integrated into your web stack easily.
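The basic pattern can be sketched as follows (hedged: enqueueing requires the `redis` and `rq` packages plus a running Redis server, so those lines are shown commented; the job function itself is plain Python):

```python
def count_words(text):
    """An ordinary function that an RQ worker would execute as a job."""
    return len(text.split())

# With Redis running, a producer enqueues the call and a worker picks it up:
#   from redis import Redis
#   from rq import Queue
#   q = Queue(connection=Redis())
#   job = q.enqueue(count_words, "a simple background job")
#   # ...then start a worker in another shell:  rq worker

print(count_words("a simple background job"))
```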


171 questions
4 votes · 1 answer

How to propagate errors in python rq worker tasks to Sentry

I have a Flask app with Sentry error tracking. Now I created some tasks with rq, but their errors do not show up in Sentry Issues stream. I can tell the issues aren't filtered out, because the number of filtered issues doesn't increase. The errors…
M. Volf
  • 1,259
  • 11
  • 29
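A common approach for this (a hedged sketch: it assumes the `sentry-sdk` package and its RQ integration) is to initialize Sentry inside the worker process as well, not only in the Flask app, so exceptions raised in jobs are captured:

```python
def init_sentry_for_rq(dsn):
    """Initialize sentry-sdk with its RQ integration in the worker process,
    so job exceptions are forwarded to Sentry. Sketch only: returns False
    when `sentry-sdk` is not installed instead of raising."""
    try:
        import sentry_sdk
        from sentry_sdk.integrations.rq import RqIntegration
    except ImportError:
        return False
    sentry_sdk.init(dsn=dsn, integrations=[RqIntegration()])
    return True
```

The key point is that the worker is a separate process from the web app, so SDK initialization in the Flask entrypoint alone does not cover it.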
4 votes · 1 answer

Do python-rq workers support multiprocessing module?

I currently have multiple python-rq workers executing jobs from a queue in parallel. Each job also utilizes the python multiprocessing module. Job execution code is simply this: from redis import Redis from rq import Queue q = Queue('calculate',…
Anubhav
  • 545
  • 3
  • 14
4 votes · 0 answers

How to implement a heartbeat for an RQ worker to redis?

I am writing an app that uses RQ for jobs. The Redis instance for RQ is in Heroku, but the RQ workers are on external machines. They make a Redis connection when they start. If Heroku moves the Redis instance (which it can and does), then the…
dfrankow
  • 20,191
  • 41
  • 152
  • 214
4 votes · 1 answer

how to get info about an rq worker working on a remote redis

I'm trying to run the rq info command, but I want to get info on a remote redis machine. how do I specify the url of the redis machine?
FuzzyAmi
  • 7,543
  • 6
  • 45
  • 79
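The rq CLI accepts a Redis URL through its -u/--url option, e.g. `rq info --url redis://my-remote-host:6379/0` (the hostname here is hypothetical). A stdlib sketch of what such a URL encodes:

```python
from urllib.parse import urlparse

# Hypothetical remote Redis; the same string is what you pass to the CLI:
#   rq info --url redis://my-remote-host:6379/0
url = "redis://my-remote-host:6379/0"
parts = urlparse(url)
print(parts.hostname, parts.port, parts.path.lstrip("/"))  # host, port, db number
```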
4 votes · 0 answers

python-rq - how do I wait for the entire queue to be handled?

I'm enqueueing several jobs to a rq.Queue and want to wait (not busy wait) until the queue is empty and all jobs were handled. Is there such an event to listen to?
ihadanny
  • 4,377
  • 7
  • 45
  • 76
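RQ does not expose a built-in "queue drained" event, so a common workaround is to poll until both the queue and its started-job registry are empty. A generic polling helper (the drain predicate at the end is a hedged sketch and needs a live `rq.Queue` plus Redis):

```python
import time

def wait_until(predicate, interval=0.5, timeout=60.0):
    """Poll `predicate` until it returns True or `timeout` elapses.
    Returns True on success, False on timeout. Busy-wait avoided by sleeping."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            return True
        time.sleep(interval)
    return False

# With a live rq.Queue `q` (requires Redis), a drain check might look like:
#   wait_until(lambda: q.is_empty() and q.started_job_registry.count == 0)
```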
4 votes · 0 answers

Which docker container should have actual job function for RQ serving a web-app?

I have a DjangoREST web app that has to queue long-running tasks from time to time. I am trying to figure out the best approach for partitioning responsibilities. At the moment, I have 3 docker "containers" not counting a MySQL DB that is serving…
Steve L
  • 1,523
  • 3
  • 17
  • 24
4 votes · 1 answer

ValueError: Unknown type

I develop locally on win10, which is a problem for the usage of the RQ task queue, which only works on linux systems because it requires the ability to fork processes. I'm trying to extend the flask-base project…
user1592380
  • 34,265
  • 92
  • 284
  • 515
4 votes · 1 answer

Is there a way to submit functions from __main__ using Python RQ

In a similar vein to this question, is there any way to submit a function defined in the same file to python-rq? @GG_Python who asked me to create a new question for this. Usage example: # somemodule.py from redis import Redis from rq import…
sheridp
  • 1,386
  • 1
  • 11
  • 24
4 votes · 2 answers

python rq worker execute tasks in parallel

I don't understand python rq that much and I just started learning about it. There is a task_a that takes 3 minutes to finish processing. @job def task_a(): time.sleep(180) print('done processing task_a') def call_3_times(): …
Minah
  • 81
  • 1
  • 9
4 votes · 3 answers

How to customize log system of python rq?

Currently the logs of rq is something like this:- 15:15:03 15:15:03 *** Listening on ingest... 15:17:41 ingest: tasks.ingest_job(aws_key='SDLKASFJHJKAHAJ', aws_secret='LDFKASJKDFJKSAHKJHkjsadhfkjaKJDSAHK') So, It logs the arguments of the job…
Sheesh Mohsin
  • 1,455
  • 11
  • 28
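Since RQ's worker logs through the standard `logging` module (under the `rq.worker` logger name, as far as its worker module is concerned), attaching your own handler and formatter changes the output format. A minimal sketch; hiding job arguments such as the keys in the excerpt above would additionally need a `logging.Filter` or a custom worker:

```python
import logging

# Custom format for RQ's worker output (hedged: "rq.worker" is the logger
# name used by RQ's worker module).
formatter = logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
handler = logging.StreamHandler()
handler.setFormatter(formatter)

rq_logger = logging.getLogger("rq.worker")
rq_logger.addHandler(handler)
rq_logger.setLevel(logging.INFO)
```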
4 votes · 2 answers

python-rq worker not reading jobs in queue

I'm facing a basic issue while setting up python-rq - the rqworker doesn't seem to recognize jobs that are pushed to the queue it's listening on. Everything is run inside virtualenv I have the following code: from redis import Redis from rq…
eternalthinker
  • 532
  • 6
  • 16
4 votes · 1 answer

Storing "meta" data on redis job is not working?

I'm trying to test a queued redis job but the meta data doesn't seem to be passing between the task and the originator. The job_id's appear to match so I'm perplexed. Maybe some fresh eyes can help me work out the problem: The task is as per the…
John Mee
  • 50,179
  • 34
  • 152
  • 186
4 votes · 2 answers

Redis still fills up when results_ttl=0, Why?

Question: Why is redis filling up if the results of jobs are discarded immediately? I'm using redis as a queue to create PDFs asynchronously and then save the result to my database. Since it's saved, I don't need to access the object at a later date and…
agconti
  • 17,780
  • 15
  • 80
  • 114
4 votes · 1 answer

Reduce the number of workers on a machine in Python-RQ?

What is a good way to reduce the number of workers on a machine in Python-RQ? According to the documentation, I need to send a SIGINT or SIGTERM signal to one of the worker processes on the machine: Taking down workers If, at any time, the worker…
Chris Dutrow
  • 48,402
  • 65
  • 188
  • 258
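Taking a worker down this way is plain POSIX signalling: RQ treats the first SIGINT/SIGTERM as a warm shutdown (finish the current job, then exit). A sketch, where the pid would come from your process manager or from `rq info`:

```python
import os
import signal

def request_warm_shutdown(pid):
    """Send SIGTERM to an RQ worker process. RQ treats the first
    SIGTERM/SIGINT as a warm shutdown: the worker finishes its current
    job and then exits. POSIX-only sketch; `pid` is the worker's OS pid."""
    os.kill(pid, signal.SIGTERM)
```

Repeating this per surplus worker shrinks the pool one process at a time without dropping in-flight jobs.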
3 votes · 1 answer

rq info showing job_id (None None):?

Whenever we are trying to see queue info using rq info -u <> command, we are getting lots of extra entries like this - a331d42408099f7e5ec9c5864 (None None): ? c352af4c2385cdf320d7b74897 (None None): ? 134174815b44c44d706417eb0 (None…
Koushik Roy
  • 6,868
  • 2
  • 12
  • 33