Currently I am using Redis to implement a priority job queue (QueueJob) in Python code (Odoo), and I am running into a concurrency problem: several jobs sometimes update or delete the same record at the same time.
From the Redis documentation I understand that I need to use a lock to prevent this concurrency problem. However, I don't understand why the lock does not behave as I expect.
Below is my code:
import json

import redis
import redis_lock  # python-redis-lock library


class PriorityQueue(object):
    def __init__(self, queue_name):
        ...
        self.redis = redis.StrictRedis(...)
        self.redis_lock = redis_lock.Lock(self.redis, queue_name)

    def first(self):
        if self.redis_lock.acquire(blocking=False):
            print("Perform task")
            # Peek at the highest-scored (highest-priority) job without removing it
            job = self.redis.zrevrange(self.queue_name, 0, 0)[0]
            job_data = json.loads(job.decode("utf-8"))
            return ChannelJob(job_data)
        else:
            print("Lock is used by other job")

    def pop(self):
        # zpopmax returns a list of (member, score) tuples
        job = self.redis.zpopmax(self.queue_name, count=1)[0][0]
        job_data = json.loads(job.decode("utf-8"))
        return ChannelJob(job_data)
The call to self.redis_lock.acquire(blocking=False) always returns False. Please help me figure out what I am doing wrong.
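For reference, this is roughly the acquire/release cycle I expect, based on the python-redis-lock documentation (a minimal standalone sketch, assuming a local Redis instance on the default port; the lock name "test-lock" is only a placeholder):

import redis
import redis_lock

# Connect to a local Redis instance (default host/port assumed)
conn = redis.StrictRedis()
lock = redis_lock.Lock(conn, "test-lock")

if lock.acquire(blocking=False):
    try:
        print("Got the lock, performing task")
    finally:
        # Release so other workers can acquire the lock later
        lock.release()
else:
    print("Lock is held by another worker")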