
Currently, I have a Redis stream to which many entries are produced. I want to process the stream in parallel but make sure each entry is processed only once. I searched the official documentation on Redis streams, and it seems that a consumer group is one solution.

So I tried to create one consumer group and let multiple consumers in that group consume the same stream in parallel (perhaps multiple consumer instances on different servers, or multiple threads on the same server).

Can a Redis consumer group guarantee that multiple consumers running in parallel consume different, mutually exclusive subsets of the same stream, so that each entry is processed only once? If that is guaranteed, then for each consumer, is XREADGROUP GROUP mygroup consumer1 [COUNT 1000] STREAMS mystream > enough when reading from the stream?
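For reference, a minimal sketch of this setup with the redis-py client; the stream, group, and consumer names (mystream, mygroup, consumer1) are taken from the question, while the COUNT/BLOCK values and everything else are assumptions:

```python
import redis

r = redis.Redis()

# Create the consumer group once; id='0' makes the group start from the
# beginning of the stream, and mkstream creates the stream if missing.
try:
    r.xgroup_create('mystream', 'mygroup', id='0', mkstream=True)
except redis.exceptions.ResponseError:
    pass  # BUSYGROUP: the group already exists

# Each parallel worker runs this with its own consumer name.
# The special id '>' requests entries never delivered to any
# consumer in this group, which is what gives each consumer a
# mutually exclusive subset.
entries = r.xreadgroup('mygroup', 'consumer1',
                       {'mystream': '>'}, count=1000, block=5000)
for stream_name, messages in entries:
    for message_id, fields in messages:
        print(message_id, fields)  # process the entry here
```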

Spaceship222

1 Answer


Yes. Each consumer in the group reads a mutually exclusive subset of the stream's entries. Each message is handled by a single consumer - the one that read it - unless it is XCLAIMed by another consumer.

Only once is an entirely different matter: it is up to the consumer to ensure that. Within the group, delivery is at-least-once - an entry that is read but never XACKed stays in the pending entries list and can be claimed and re-delivered - so the consumer should acknowledge each entry after processing it and keep the processing itself idempotent.
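A minimal sketch of that consumer-side discipline, again with redis-py and the question's names; process_entry is a hypothetical idempotent handler:

```python
import redis

r = redis.Redis()

def process_entry(fields):
    """Hypothetical handler; must be idempotent, since an entry may be
    re-delivered if this consumer crashes before XACK."""
    print(fields)

while True:
    # Block up to 5s waiting for entries not yet delivered to the group.
    resp = r.xreadgroup('mygroup', 'consumer1',
                        {'mystream': '>'}, count=100, block=5000)
    for _stream, messages in resp:
        for message_id, fields in messages:
            process_entry(fields)
            # Acknowledge only after successful processing, so a crash
            # mid-processing leaves the entry pending for recovery.
            r.xack('mystream', 'mygroup', message_id)
```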

Itamar Haber
  • Even in the case that multiple consumers are running in parallel, is a mutually exclusive subset still guaranteed? – Spaceship222 Nov 22 '21 at 11:23
  • Yes, that's the reason for the consumer group functionality in the first place :) – Itamar Haber Nov 22 '21 at 15:01
  • I don't understand: you first say yes, each message will be handled by a single consumer, but then you say only-once is not guaranteed. If it's handled by a single consumer, how is it not only once? Maybe you mean that if a single consumer fails to process it (ACK the message), then another consumer will have a go at it, which is fine. But then only one successful read + ACK is guaranteed? – TondaCZE Jun 16 '23 at 12:27
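To illustrate the re-delivery path the last comment describes: a sketch of claiming stale pending entries, assuming Redis 6.2+ (for XAUTOCLAIM) and the same names as above. An entry that was delivered but never XACKed remains in the group's pending entries list until some consumer claims it:

```python
import redis

r = redis.Redis()

# Claim entries that have been pending (delivered but not XACKed) for
# over 60 seconds, transferring ownership to this consumer. The reply
# shape varies slightly by Redis version; index 1 is the message list.
resp = r.xautoclaim('mystream', 'mygroup', 'consumer2',
                    min_idle_time=60000, start_id='0-0', count=100)
for message_id, fields in resp[1]:
    print(message_id, fields)  # re-process idempotently
    r.xack('mystream', 'mygroup', message_id)
```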