
I am using RQ with Flask to queue jobs in a loop. I have the following code:

from rq import Queue
from rq.job import Job
from worker import conn

q = Queue(connection=conn)

for i in range(5):
    job = q.enqueue_call(func=process_data, args=(i, data))
    print(job.get_id())

Now I am getting the error:

TypeError: cannot pickle '_thread.lock' object

My worker has the following code:

import os

import redis
from rq import Worker, Queue, Connection

listen = ['default']

redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')

conn = redis.from_url(redis_url)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(list(map(Queue, listen)))
        worker.work()

How can this be corrected?

Happy Coder

2 Answers


I solved a similar problem by downgrading from Python 3.8 to Python 3.7.

My situation was a little different. I am running a Django server, which schedules tasks using Django-Q. However, Django-Q is based on RQ, and the error

TypeError: cannot pickle '_thread.lock' object

is thrown by Python's multiprocessing module, so I believe the solution will translate.

As of May 2020, I expect that this is a bug, although it is unclear what causes it.
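For what it's worth, the underlying failure is easy to reproduce directly with the standard library, independent of Django-Q or RQ. A minimal sketch (the exact message wording differs slightly between Python 3.7 and 3.8):

```python
import pickle
import threading

# threading.Lock() returns a _thread.lock under the hood;
# pickle refuses to serialize it, which is the error seen above.
lock = threading.Lock()
try:
    pickle.dumps(lock)
except TypeError as e:
    print(e)  # e.g. "cannot pickle '_thread.lock' object" on 3.8
```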

AJ Koenig
TypeError: cannot pickle '_thread.lock' object

This error occurs when you try to serialize a non-serializable object with pickle. You can verify this yourself by creating a Redis() object and trying to pickle it:

import pickle
from redis import Redis

r = Redis()
pickle.dumps(r)

It will give you the same error. In your current scenario, the error most likely comes from

q = Queue(connection=conn)

since you are importing the connection conn from the worker module. You can instead create the connection directly where you use it:

r = Redis(host='localhost', port=6379, db=0)
q = Queue(connection=r)
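More generally, RQ pickles the function and its arguments when a job is enqueued, so any argument that holds a lock, socket, or live connection will trigger this error. As a sketch, a small helper (is_picklable is my own name, not part of RQ) can check arguments before enqueueing:

```python
import pickle
import threading

def is_picklable(obj):
    """Return True if pickle can serialize obj, as RQ requires for job args."""
    try:
        pickle.dumps(obj)
        return True
    except (TypeError, AttributeError, pickle.PicklingError):
        return False

print(is_picklable({"rows": [1, 2, 3]}))  # True  -> safe to pass as a job arg
print(is_picklable(threading.Lock()))     # False -> would raise the error
```

If data in your loop fails this check, that is where the TypeError is coming from.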