RQ is a simple, lightweight Python library for creating background jobs and processing them.
RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry. It can be integrated into your web stack easily.
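The basic pattern from the RQ docs looks roughly like this (a minimal sketch; count_words is just an illustrative function and a local Redis server is assumed):

from redis import Redis
from rq import Queue

def count_words(text):
    # Illustrative task; any importable function can be enqueued.
    return len(text.split())

q = Queue(connection=Redis())                  # assumes a Redis server on localhost
job = q.enqueue(count_words, "hello background world")
print(job.id)                                  # the job runs once an `rq worker` picks it up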
I have a long task that goes off into a python-rq worker queue.
@cache.cached(timeout=2592000)
@app.route('/as/', methods=['GET'])
@db_session
def auto_suggest(keyword):
job = q.enqueue(find_keyword, keyword)
while not job:
…
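For comparison, a common Flask + RQ pattern (a sketch, not the asker's exact setup; the result route and the find_keyword stub are illustrative) is to enqueue the job, return its id immediately, and poll its status from a separate endpoint:

from flask import Flask, jsonify
from redis import Redis
from rq import Queue
from rq.job import Job

app = Flask(__name__)
redis_conn = Redis()
q = Queue(connection=redis_conn)

def find_keyword(keyword):
    # Stand-in for the asker's long-running task.
    return keyword.upper()

@app.route('/as/<keyword>', methods=['GET'])
def auto_suggest(keyword):
    job = q.enqueue(find_keyword, keyword)        # returns immediately
    return jsonify(job_id=job.get_id()), 202

@app.route('/as/result/<job_id>', methods=['GET'])
def auto_suggest_result(job_id):
    job = Job.fetch(job_id, connection=redis_conn)
    if job.is_finished:
        return jsonify(result=job.result)
    return jsonify(status=job.get_status()), 202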
I am having issues with this setup. In summary, once the user presses submit on a form, the data is passed via Redis to an RQ worker for processing.
The error from rqworker is
23:56:44 RQ worker u'rq:worker:HAFun.12371' started, version…
I have a small Python Flask web server on an Ubuntu box (nginx and uWSGI) that I just started using to receive and process webhooks. Part of the webhook processing can include sending an email, which I noticed causes a delay and subsequently blocks…
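A sketch of how the slow email step is typically offloaded to RQ so the webhook can respond immediately (send_email and the route here are placeholders):

from flask import Flask, request
from redis import Redis
from rq import Queue

app = Flask(__name__)
q = Queue(connection=Redis())

def send_email(payload):
    # Placeholder for the actual (slow) email-sending code.
    pass

@app.route('/webhook', methods=['POST'])
def webhook():
    q.enqueue(send_email, request.get_json())   # a worker handles the slow part
    return '', 204                              # respond right away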
I am trying to queue tasks for Python RQ using JavaScript. To do that I just monitored what RQ does with Redis. In particular, it stores a pickled list indicating where to find the task to run, including input args, etc. Here I am using the example…
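One way to see exactly what would have to be replicated from JavaScript is to enqueue a job from Python and inspect the keys RQ writes (a sketch; the exact hash fields vary by RQ version):

from redis import Redis
from rq import Queue

conn = Redis()
q = Queue('default', connection=conn)
job = q.enqueue('time.sleep', 1)          # enqueue by dotted path

print(conn.hgetall(job.key))              # the rq:job:<id> hash, incl. the pickled 'data' field
print(conn.lrange(q.key, 0, -1))          # the rq:queue:default list holding the job id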
I am getting an authentication to MongoDB for every query I run using PyMongo's MongoClient. This seems expensive and unnecessary:
2015-02-13T09:38:08.091-0800 [conn375243] authenticate db: { authenticate: 1, user: "", nonce: "xxx", key: "xxx"…
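A common fix (a sketch, assuming the client is currently being created per query or per job): build the authenticated MongoClient once per worker process and reuse its connection pool; the URI and names below are placeholders.

from pymongo import MongoClient

client = MongoClient('mongodb://user:password@localhost:27017/mydb')   # created once per process
db = client['mydb']

def my_job(doc_id):
    # Every call reuses the pooled, already-authenticated connection.
    return db.mycollection.find_one({'_id': doc_id})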
I'm using Python v2.7.3 and installed python-rq via easy_install. While trying to create an RQ queue following the steps given at http://python-rq.org/, it fails with a message like
>>> from redis import Redis
>>> from rq import Queue
Traceback (most recent call…
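One quick sanity check when the import itself fails (a sketch, not a diagnosis of this particular traceback): confirm which redis-py and rq versions easy_install actually picked up, since very old redis-py releases lack APIs that newer rq expects.

import redis
import rq
print(redis.__version__)   # redis-py version
print(rq.__version__)      # rq version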
I am using Python-RQ to create a job; when you create a job you get back a job.id:
f311ae30-b623-4b38-9dcb-0edd0133a6e6
Then I use that id to check if the result is finished, which is great.
Then this result is cached for 500 seconds.
Now, here is…
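A sketch of that flow with the relevant RQ calls (some_task is a placeholder; result_ttl is what keeps the finished result around for 500 seconds):

from redis import Redis
from rq import Queue
from rq.job import Job

conn = Redis()
q = Queue(connection=conn)

def some_task():
    # Placeholder for the real work.
    return 42

job = q.enqueue(some_task, result_ttl=500)   # keep the finished result for 500 s
job_id = job.get_id()                        # a UUID string like the one above

# Later, e.g. in another request:
job = Job.fetch(job_id, connection=conn)
if job.is_finished:
    print(job.result)
else:
    print(job.get_status())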
When I try to use Flask-RQ2, these logs are printed in the Flask log and the queue doesn't work at all:
13:51:26 Worker rq:worker:6cee0c3868f0476a88616b40447a73bc started with PID 295, version 1.15.1
13:51:26 Subscribing to channel…
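For comparison, a minimal Flask-RQ2 setup as documented looks roughly like this (a sketch assuming a local Redis; the job only runs once a worker is started separately, e.g. with `flask rq worker`):

from flask import Flask
from flask_rq2 import RQ

app = Flask(__name__)
app.config['RQ_REDIS_URL'] = 'redis://localhost:6379/0'
rq = RQ(app)

@rq.job
def add(x, y):
    return x + y

@app.route('/enqueue')
def enqueue():
    job = add.queue(1, 2)    # pushes the job; a separate worker must pick it up
    return job.id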
I'm enqueing my tasks like this:
from redis import Redis
from rq import Queue, Retry
queue = Queue(connection=Redis())
queue.enqueue("path.to.my.func", retry=Retry(max=3), on_failure=my_failure)
And then on the failure callback I want to stop the…
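For context, RQ hands the failure callback the job, the Redis connection, and the exception info; a sketch of that signature (what to do inside it to stop the retries is the open question here):

from redis import Redis
from rq import Queue, Retry

def my_failure(job, connection, type, value, traceback):
    # Called after a failed attempt; the exception arrives as type/value/traceback.
    print(f"job {job.id} failed with {type.__name__}: {value}")

queue = Queue(connection=Redis())
queue.enqueue("path.to.my.func", retry=Retry(max=3), on_failure=my_failure)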
I'm having problems using Redis Queue on my Ubuntu Server 22.04 (running on AWS with 16 GB of RAM and 4 CPU cores).
I have a web application using Docker Compose with Python 3.10, Redis 7.0.11, and Redis Queue (RQ) 1.14.0.
In my application I send a task to…
I am currently attempting to debug a function that's enqueued inside an RQ queue in VS Code.
However, RQ forks the process to produce its workers, which I think is why it is impossible to intercept the breakpoint.
I use debugpy as a debugging…
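One common workaround (a sketch, not necessarily the asker's setup): run the jobs with a SimpleWorker, which executes them in the same process instead of forking, so debugpy/VS Code breakpoints inside the task are hit:

from redis import Redis
from rq import Queue, SimpleWorker

conn = Redis()
queue = Queue(connection=conn)

worker = SimpleWorker([queue], connection=conn)
worker.work(burst=True)   # run pending jobs in this process, then exit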
I've set up a test within Django that sends an email in the background using Django-RQ.
I call the code that enqueues the send_email task, then I get the django-rq worker and call .work() with burst=True.
In my console, I can see the django-rq…
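A sketch of that test pattern with django-rq's helpers (send_email here is a stand-in for the asker's task):

import django_rq

def send_email(address):
    # Stand-in for the real task under test.
    pass

def test_send_email_in_background():
    django_rq.get_queue('default').enqueue(send_email, 'user@example.com')
    django_rq.get_worker('default').work(burst=True)   # drain the queue synchronously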
The examples provided in the Python-RQ documentation consistently show functions being enqueued using queue.enqueue(). For example:
job = q.enqueue(count_words_at_url, 'http://nvie.com')
Is it possible to enqueue a method of a particular object…
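Enqueueing a bound method generally does work, since RQ pickles the instance along with the call; a sketch (the Counter class is illustrative, and the instance must be picklable and importable by the worker):

from redis import Redis
from rq import Queue

class Counter:
    def __init__(self, start):
        self.start = start

    def add(self, n):
        return self.start + n

q = Queue(connection=Redis())
counter = Counter(10)
job = q.enqueue(counter.add, 5)   # a method of this particular object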
I am running a test app on Heroku with RQ. I want to fetch the result of queueing and running a function in RQ and send it over to HTML. My code is as follows:
Worker.py
import os
import redis
from rq import Worker, Queue, Connection
listen = ['high',…
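For reference, the classic Heroku-style worker.py this appears to follow is roughly as below (the queue names are placeholders, since the asker's list is truncated above):

import os

import redis
from rq import Worker, Queue, Connection

listen = ['high', 'default', 'low']                        # placeholder queue names
redis_url = os.getenv('REDIS_URL', 'redis://localhost:6379')
conn = redis.from_url(redis_url)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(list(map(Queue, listen)))
        worker.work()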
I'm using a task queue with Python (RQ). Since workers run concurrently, without any configuration the log messages from all workers are mixed up.
I want to organize logging such that at any time I can get the exact full log for a given task run by a given…
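One possible approach (a sketch, not built-in RQ behaviour): have each task look up its own job via get_current_job() and attach a file handler named after the job id, so every run gets its own log file:

import logging
from rq import get_current_job

def my_task(x):
    job = get_current_job()
    logger = logging.getLogger(f"task.{job.id}")
    logger.setLevel(logging.INFO)
    handler = logging.FileHandler(f"/tmp/rq-job-{job.id}.log")   # per-job log file
    handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
    logger.addHandler(handler)
    try:
        logger.info("started with x=%s", x)
        # ... actual work ...
        logger.info("finished")
    finally:
        logger.removeHandler(handler)
        handler.close()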