Currently, I have a Redis stream with many entries being produced. I want to process the stream in parallel while making sure each entry is processed only once. I searched the official documentation on Redis streams, and it seems that a consumer group is one solution.
So I plan to create one consumer group and have multiple consumers in that group consume the same stream in parallel (perhaps multiple consumer instances on different servers, or multiple threads on the same server).
Can a Redis consumer group guarantee that multiple consumers running in parallel consume disjoint, exclusive subsets of the same stream, so that each entry is processed only once?
If that can be guaranteed, then for each consumer, is a read like `XREADGROUP GROUP mygroup consumer1 [COUNT 1000] STREAMS mystream >` enough?
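For reference, here is the rough command sequence I have in mind, sketched with redis-cli (`mystream`, `mygroup`, and `consumer1` are placeholder names, and the entry ID in the last line is just an example):

```
# Create the group once. '$' means deliver only new entries;
# '0' would deliver from the beginning of the stream.
# MKSTREAM creates the stream if it does not exist yet.
XGROUP CREATE mystream mygroup $ MKSTREAM

# Each consumer reads its own batch of not-yet-delivered entries
# using the special '>' ID:
XREADGROUP GROUP mygroup consumer1 COUNT 1000 STREAMS mystream >

# After successfully processing an entry, acknowledge it so it is
# removed from the consumer's pending entries list:
XACK mystream mygroup 1526569495631-0
```

My understanding is that `>` asks for entries never delivered to any consumer in the group, and that unacknowledged entries stay in the pending list, but I am not sure whether the `XREADGROUP` call alone is sufficient for my exactly-once goal.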