RQ is a simple, lightweight Python library for creating background jobs and processing them.
RQ (Redis Queue) is a simple Python library for queueing jobs and processing them in the background with workers. It is backed by Redis and designed to have a low barrier to entry, so it can be integrated into your web stack easily.
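As a hedged illustration of that workflow (the add function below is a made-up placeholder): a job is just a Python function call pushed onto a Redis-backed queue, and a separate worker process started with rq worker executes it.

# tasks.py -- any importable function can become a job
def add(x, y):
    return x + y

# elsewhere in the web app: push a call onto the queue;
# a worker process started with `rq worker` picks it up and runs it
from redis import Redis
from rq import Queue

q = Queue(connection=Redis())
q.enqueue(add, 2, 3)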
I am using Redis and python-rq to manage a data processing task. I want to distribute the data processing across multiple servers (each server would manage several RQ workers), but I would like to keep a single queue on a master server.
Is there a…
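One common sketch of that layout (not from the question; hostname and queue name are placeholders): every worker server points its Worker at the Redis instance running on the master, so the queue itself exists only on the master.

from redis import Redis
from rq import Queue, Worker

# Run this on each processing server; 'master.example.com' is a placeholder
# for the master host where Redis (and the single shared queue) lives.
redis_conn = Redis(host='master.example.com', port=6379)

queue = Queue('processing', connection=redis_conn)   # one shared queue for everyone
Worker([queue], connection=redis_conn).work()        # blocks and consumes jobs from it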
I guess I have a somewhat bad setup of my Docker containers.
Each time I run a task from Django, the ps aux output in the Docker container shows a new python manage.py rqworker mail process being created instead of the existing one being used.
See…
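A sketch of the usual split with django-rq, assuming a queue named mail: the worker is a single long-running process started once by the container (python manage.py rqworker mail), and the Django code only enqueues onto that queue; the view and task below are hypothetical.

# tasks.py -- hypothetical job function
def send_welcome_mail(user_id):
    print('sending mail to user %s' % user_id)

# views.py -- Django only enqueues; the one rqworker process does the work
from django.http import HttpResponse
import django_rq
from .tasks import send_welcome_mail

def notify(request):
    # The container entrypoint runs `python manage.py rqworker mail` exactly once;
    # the view just pushes work onto that queue.
    django_rq.get_queue('mail').enqueue(send_welcome_mail, request.user.pk)
    return HttpResponse('queued')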
My Redis server is running under Ubuntu 16.04, and I have RQ Dashboard running to monitor the queue. The Redis server has a password, which I supply for the initial connection. Here's my code:
from rq import Queue, Connection, Worker
from redis…
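For reference, a sketch of passing the password explicitly to the connection that Queue and Worker share; host, port and password here are placeholders.

from redis import Redis
from rq import Queue, Worker

redis_conn = Redis(host='localhost', port=6379, password='s3cret')  # placeholder credentials

queue = Queue(connection=redis_conn)
Worker([queue], connection=redis_conn).work()

# The same connection can also be given on the command line, e.g.
#   rq worker --url redis://:s3cret@localhost:6379/0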
I have two kinds of jobs: ones that I want to run in serial and ones that I want to run concurrently, in parallel. However, I want the parallel jobs to get scheduled in serial (if you're still following). That is:
Do A.
Wait for A, do B.
Wait for B,…
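A sketch of one way to express that shape with RQ's depends_on (the step functions are placeholders): each job that must wait is enqueued with depends_on pointing at its predecessor, and the jobs meant to run in parallel all depend on the same predecessor.

from redis import Redis
from rq import Queue

def step_a():
    pass                 # placeholder serial step

def step_b(i):
    pass                 # placeholder parallel step

q = Queue(connection=Redis())

job_a = q.enqueue(step_a)                            # Do A.
parallel = [q.enqueue(step_b, i, depends_on=job_a)   # each of these waits for A,
            for i in range(4)]                       # then runs alongside the others
                                                     # (given several workers)
# A job that waits for *all* of the parallel ones needs depends_on with a list of jobs,
# which only newer RQ releases support.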
My rq tasks are running correctly, but none of the functions that get all the jobs works --
$ pip3 freeze | egrep -i "rq|redis"
redis==2.10.6
rq==0.12.0
$ flask shell
Python 3.6.5 (default, Apr 1 2018, 05:46:30)
[GCC 7.3.0] on linux
App: app…
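One thing worth checking here (a sketch, not the asker's code): on this RQ version Queue.job_ids only lists jobs still waiting in the queue, while jobs a worker has picked up or finished live in separate registries.

from redis import Redis
from rq import Queue
from rq.registry import StartedJobRegistry, FinishedJobRegistry

redis_conn = Redis()
q = Queue('default', connection=redis_conn)

print(q.job_ids)                                                             # still queued
print(StartedJobRegistry('default', connection=redis_conn).get_job_ids())    # being worked on
print(FinishedJobRegistry('default', connection=redis_conn).get_job_ids())   # already finished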
We're currently using Flask-RQ along with Flask-SQLAlchemy and running into some performance problems. Here's our high-level architecture:
API endpoint is hit
Time consuming tasks get queued into RQ
RQ worker forks a new process to perform the…
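One frequent culprit in that kind of setup (a hedged sketch, not necessarily this asker's problem) is the forked worker process reusing SQLAlchemy connections inherited from the parent; disposing the engine at the start of the job forces the fork to open its own. create_app and db below are placeholders for the app factory and Flask-SQLAlchemy handle.

# tasks.py -- hypothetical RQ job executed inside the forked worker process
from myapp import create_app, db     # placeholder imports

def heavy_task(record_id):
    app = create_app()
    with app.app_context():
        # Drop pooled connections inherited from the parent process
        # so this fork opens fresh ones.
        db.engine.dispose()
        # ... slow queries / writes via db.session go here ...
        db.session.remove()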
I'm trying to connect an RQ worker to a Redis server on a Unix domain socket.
I've tried the following:
$ rq worker --url '/path/to/redis.sock'
Error 111 connecting to None:6379. Connection refused.
$ rq worker --url…
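The --url value goes through redis-py's from_url, which wants a URL scheme; for a domain socket that is the unix:// form, so an invocation along these lines should work (reusing the path from above).

$ rq worker --url 'unix:///path/to/redis.sock'

# or equivalently from Python:
from redis import Redis
from rq import Queue, Worker

redis_conn = Redis(unix_socket_path='/path/to/redis.sock')
Worker([Queue(connection=redis_conn)], connection=redis_conn).work()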
I want to send logs to Stackdriver Logging on App Engine using a Redis queue. So I'm using a Redis server, Redis Queue, and Python logging to do this. Here's my code:
import logging
from redis import Redis
from rq import Queue
import time
class…
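A hedged sketch of the job side, assuming the google-cloud-logging client library is installed and credentials are configured: the handler attaches to the standard logging module inside the worker, so the queued function just logs normally.

import logging
from google.cloud import logging as gcp_logging   # assumes google-cloud-logging is available
from redis import Redis
from rq import Queue

def log_event(message):
    # Runs inside the worker: route stdlib logging to Stackdriver, then log.
    client = gcp_logging.Client()
    client.setup_logging()
    logging.info(message)

q = Queue(connection=Redis())
q.enqueue(log_event, 'processed one batch')   # the worker performs the actual logging call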
I was wondering if there's a way to run tasks asynchronously in the background (using Celery, for example) so that they never run simultaneously.
That is, each task can run simultaneously with itself but not with another…
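With RQ, one hedged way to get that guarantee is a dedicated queue consumed by exactly one worker: its jobs still run in the background, but strictly one at a time. The queue name and task are placeholders.

from redis import Redis
from rq import Queue, Worker

def exclusive_task(n):
    print('running', n)     # in practice, define this in an importable module (e.g. tasks.py)

redis_conn = Redis()
serial_q = Queue('serial', connection=redis_conn)

serial_q.enqueue(exclusive_task, 1)
serial_q.enqueue(exclusive_task, 2)   # will not start until the first job has finished

# Start exactly one worker for this queue; a single worker processes its jobs sequentially.
Worker([serial_q], connection=redis_conn).work()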
I'm building a distributed crawling mechanism and want to make sure that no more than 30 requests are made to the server in one minute. Each enqueued task makes a request.
All tasks are enqueued in Redis and are dequeued using the API provided by…
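A hedged sketch of a shared Redis counter that every worker consults before making its request; the key prefix and limits below are illustrative, not from the question.

import time
from redis import Redis

redis_conn = Redis()
LIMIT = 30      # requests
WINDOW = 60     # seconds

def acquire_slot():
    """Block until this process may make one request within the shared 30-per-minute budget."""
    while True:
        key = 'crawl:%d' % (int(time.time()) // WINDOW)   # one counter per minute-long window
        count = redis_conn.incr(key)
        redis_conn.expire(key, WINDOW * 2)                # old windows expire on their own
        if count <= LIMIT:
            return
        time.sleep(1)                                     # over budget: wait for the next window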
I would like to have a computational simulation running in a background process (started with Redis RQ) where I can query its current state, as well as change parameters, using Django.
For the sake of simplicity: let's say I want to run the following…
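A hedged sketch using RQ's job.meta as the shared state channel: the simulation publishes progress (and re-reads parameters) through meta, and the Django side fetches the same Job by id. The names and the speed parameter are placeholders.

# worker side -- the long-running simulation
import time
from rq import get_current_job

def run_simulation(steps):
    job = get_current_job()
    for i in range(steps):
        job.refresh()                            # pick up parameter changes written by Django
        speed = job.meta.get('speed', 1.0)
        time.sleep(1.0 / speed)                  # stand-in for one simulation step
        job.meta['progress'] = (i + 1) / steps   # publish current state
        job.save_meta()

# Django side -- query or tweak the running job
from redis import Redis
from rq.job import Job

job_id = 'abc123'   # placeholder: the id returned when the job was enqueued
job = Job.fetch(job_id, connection=Redis())
print(job.meta.get('progress'))
job.meta['speed'] = 2.0
job.save_meta()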
I'm using Django-RQ in a Heroku application to handle background tasks.
When an error occurs in my background tasks, it doesn't get sent to Sentry.
My logging settings in settings.py are below:
LOGGING = {
    'version': 1,
    …
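One hedged route (assuming the sentry-sdk package rather than raven): initialise it with the RQ integration in a module the worker process also loads, e.g. settings.py, so job exceptions are forwarded to Sentry independently of the LOGGING dict. The DSN below is a placeholder.

import sentry_sdk
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.rq import RqIntegration

sentry_sdk.init(
    dsn='https://examplePublicKey@o0.ingest.sentry.io/0',   # placeholder DSN
    integrations=[DjangoIntegration(), RqIntegration()],    # capture web and worker errors
)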
I'm trying to poll the status of a youtube-dl job, but I'm having trouble figuring out how to get this to work.
The following is my python-rq worker.py file:
class MyLogger(object):
    def debug(self, msg):
        pass
    def warning(self,…
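For the polling side, a hedged sketch using the job id returned at enqueue time: fetch the Job and loop on get_status, reading any progress the task stores in job.meta (the 'progress' key here is hypothetical).

import time
from redis import Redis
from rq.job import Job

redis_conn = Redis()

def wait_for(job_id):
    job = Job.fetch(job_id, connection=redis_conn)
    while job.get_status() not in ('finished', 'failed'):
        time.sleep(2)
        job.refresh()                                        # re-read status and meta from Redis
        print(job.get_status(), job.meta.get('progress'))    # hypothetical meta set by the task
    return job.result                                        # whatever the youtube-dl task returned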
I am looking to run queued jobs using RQ, but looking at the following example:
from rq import Queue
from redis import Redis
from somewhere import count_words_at_url
# Tell RQ what Redis connection to use
redis_conn = Redis()
q =…
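Continuing that example as a sketch (the URL is the one from RQ's documentation): enqueue returns immediately with a Job whose result only appears after a separately started rq worker has processed it.

import time
from redis import Redis
from rq import Queue
from somewhere import count_words_at_url

redis_conn = Redis()
q = Queue(connection=redis_conn)

job = q.enqueue(count_words_at_url, 'http://nvie.com')   # returns a Job handle right away
print(job.result)        # None until a worker has run the job

time.sleep(2)            # give an already-running `rq worker` time to pick it up
print(job.result)        # now the word count, assuming a worker is running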