
I have a situation where, to achieve better performance, I have to read multiple Kafka messages at a time. I searched on the internet and found that Kafka has batch functionality where we can read messages in batches. The problem is that I am not able to configure it to receive only a maximum of x messages at a time.

Code that I found:

await consumer.run({
    eachBatchAutoResolve: true,
    eachBatch: async ({
        batch,
        resolveOffset,
        heartbeat,
        commitOffsetsIfNecessary,
        uncommittedOffsets,
        isRunning,
        isStale,
    }) => {
        for (const message of batch.messages) {
            console.log({
                topic: batch.topic,
                partition: batch.partition,
                highWatermark: batch.highWatermark,
                message: {
                    offset: message.offset,
                    key: message.key?.toString(),     // key can be null, so guard the toString()
                    value: message.value?.toString(), // value can also be null (tombstones)
                    headers: message.headers,
                }
            })

            resolveOffset(message.offset) // mark this message as processed
            await heartbeat()             // keep the consumer alive in the group
        }
    },
})

Environment: Node.js
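To show what I mean by "max x messages at a time", here is a minimal sketch of an application-level workaround: slicing the delivered batch into chunks of a hypothetical CHUNK_SIZE inside eachBatch. Note this only caps how many messages my handler touches per iteration, not how many the broker actually sends; handleMessage is a hypothetical processing function.

const CHUNK_SIZE = 10 // hypothetical "x"; adjust as needed

await consumer.run({
    eachBatchAutoResolve: false, // resolve offsets manually, once per chunk
    eachBatch: async ({ batch, resolveOffset, heartbeat }) => {
        for (let i = 0; i < batch.messages.length; i += CHUNK_SIZE) {
            const chunk = batch.messages.slice(i, i + CHUNK_SIZE)

            // process at most CHUNK_SIZE messages at a time
            await Promise.all(chunk.map(message => handleMessage(message)))

            // mark the last message of the chunk as processed and stay in the group
            resolveOffset(chunk[chunk.length - 1].offset)
            await heartbeat()
        }
    },
})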

  • You can't really specify the number of messages to retrieve in a batch, but you can estimate the size of the messages, and set your consumer to retrieve a certain number of bytes at a time, which should get you close. https://stackoverflow.com/a/35630897/1630893 – Mike Gardner May 25 '22 at 14:31
  • @Mike What about `max.poll.records`? That'll set an upper bound per poll – OneCricketeer May 25 '22 at 15:12
  • @OneCricketeer That's correct, but it doesn't override the amount of data fetched from the topic at a time. That is still based on bytes min/max settings. – Mike Gardner May 25 '22 at 15:58
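A minimal sketch of the byte-based approach described in the comments, using the KafkaJS consumer options minBytes / maxBytes / maxBytesPerPartition / maxWaitTimeInMs. The broker list, group id, and the per-message size estimate are hypothetical; as far as I can tell, KafkaJS does not expose a direct equivalent of the Java client's `max.poll.records`, so byte limits are the closest knob.

const { Kafka } = require('kafkajs')

const kafka = new Kafka({ clientId: 'my-app', brokers: ['localhost:9092'] }) // hypothetical broker

// Estimate the average message size and cap the fetch in bytes so that a
// batch is roughly x messages. These are byte limits, not message counts.
const AVG_MESSAGE_BYTES = 1024 // hypothetical estimate
const MAX_MESSAGES = 50        // the "x" we are aiming for

const consumer = kafka.consumer({
    groupId: 'my-group',
    minBytes: 1,
    maxBytesPerPartition: AVG_MESSAGE_BYTES * MAX_MESSAGES, // per-partition fetch cap
    maxBytes: AVG_MESSAGE_BYTES * MAX_MESSAGES,             // overall fetch cap
    maxWaitTimeInMs: 100, // don't wait long for the fetch to fill up
})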

0 Answers