
I'm putting together a producer/consumer setup using Kombu on Redis, but I'm running into an issue: if I start a consumer and then launch the producer with range(10000), I can confirm that the producer has queued all 10k items, yet the consumer does not receive all 10k of them. Is there a limitation I'm not aware of, either in Kombu or Redis? With range(9000) everything works correctly, and all of the keys/acks drain properly.

from kombu import BrokerConnection, Exchange, Queue


class ProduceConsume(object):
    def __init__(self, exchange_name):
        # Non-durable fanout exchange bound to a single test queue.
        exchange = Exchange(exchange_name, type='fanout', durable=False)
        self.queue_name = 'test_queue'
        self.queue = Queue(self.queue_name, exchange)

    def producer(self, inp):
        with BrokerConnection("redis://localhost:6379/15") as conn:
            with conn.SimpleQueue(self.queue) as queue:
                for payload in inp:
                    # Zero-pad so the printed output is easy to eyeball.
                    queue.put(str(payload).zfill(5))
                    print(str(payload).zfill(5))

    def consumer(self):
        with BrokerConnection("redis://localhost:6379/15") as conn:
            with conn.SimpleQueue(self.queue) as queue:
                while True:
                    # get() blocks until a message arrives.
                    message = queue.get()
                    message.ack()
                    print(message.payload)
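
For reference, a small driver along these lines (the exchange name 'test_exchange' is my assumption) reproduces the failure: start the consumer in one process, then run the producer with range(10000) in another.

# Hypothetical driver for the class above; run each call in its own process.
pc = ProduceConsume('test_exchange')

# Process 1: start draining first (blocks forever).
pc.consumer()

# Process 2: then enqueue 10k zero-padded payloads.
pc.producer(range(10000))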

1 Answer


Not a full answer, but switching from Redis to RabbitMQ made the message-dropping issue go away, possibly because of a problem with non-direct exchanges. Feel free to submit a more informed answer, but hopefully this helps someone out; a minimal sketch of the switch follows the update below.

Update: this is a bug in Kombu; see https://github.com/celery/kombu/issues/593
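
For anyone trying the workaround, here's a minimal sketch of the same SimpleQueue round trip pointed at RabbitMQ; the only code change from the snippet above is the broker URL (the amqp://guest:guest@localhost:5672// URL assumes a default local RabbitMQ install, so adjust for your setup).

from kombu import BrokerConnection, Exchange, Queue

# Assumed default local RabbitMQ credentials and vhost.
BROKER_URL = "amqp://guest:guest@localhost:5672//"

exchange = Exchange('test_exchange', type='fanout', durable=False)
queue = Queue('test_queue', exchange)

def produce(inp):
    with BrokerConnection(BROKER_URL) as conn:
        with conn.SimpleQueue(queue) as sq:
            for payload in inp:
                sq.put(str(payload).zfill(5))

def consume():
    with BrokerConnection(BROKER_URL) as conn:
        with conn.SimpleQueue(queue) as sq:
            while True:
                message = sq.get()  # blocks until a message arrives
                message.ack()
                print(message.payload)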
