My question is coming from this discussion: https://stackoverflow.com/a/74306116/3551820
I've added the configuration needed to enable batch consumption, but Spring still doesn't use the batch converter to convert the headers of each message (`kafka_batchConvertedHeaders`).
```java
@StreamListener(target = "command-my-setup-input-channel")
public void handleMySetup(Message<List<MySetupDTO>> messages) {
    // here the headers arrive as byte[]
    List<?> batchHeaders = messages.getHeaders().get(KafkaHeaders.BATCH_CONVERTED_HEADERS, List.class);
    log.warn("Messages received: {}", messages.getPayload().size()); // size of 2
}
```
The bean responsible for the conversion:
```java
@Bean("batchConverter")
BatchMessageConverter batchConverter(KafkaHeaderMapper kafkaHeaderMapperCustom) {
    BatchMessagingMessageConverter batchConv = new BatchMessagingMessageConverter();
    batchConv.setHeaderMapper(kafkaHeaderMapperCustom);
    return batchConv;
}
```
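For context, the `customHeaderMapper` bean referenced by `headerMapperBeanName` in the binder configuration below is declared along these lines (a simplified sketch using Spring Kafka's `DefaultKafkaHeaderMapper`; the actual bean may differ):

```java
@Bean("customHeaderMapper")
KafkaHeaderMapper kafkaHeaderMapperCustom() {
    // DefaultKafkaHeaderMapper maps headers to/from their Java types,
    // decoding String-valued headers instead of leaving them as byte[]
    return new DefaultKafkaHeaderMapper();
}
```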
The configuration:
```yaml
spring.cloud.stream:
  kafka:
    binder:
      autoCreateTopics: true
      autoAddPartitions: true
      healthTimeout: 10
      requiredAcks: 1
      minPartitionCount: 1
      replicationFactor: 1
      headerMapperBeanName: customHeaderMapper
    bindings:
      command-my-setup-input-channel:
        consumer:
          autoCommitOffset: false
          batch-mode: true # enabling batch mode
          startOffset: earliest
          resetOffsets: true
          converter-bean-name: batchConverter # bean mapping
          ackMode: manual
          configuration:
            heartbeat.interval.ms: 1000
            max.poll.records: 2
            max.poll.interval.ms: 890000
            value.deserializer: com.xpto.MySetupDTODeserializer
  bindings:
    command-my-setup-input-channel:
      destination: command.my.setup
      content-type: application/json
      binder: kafka
      configuration:
        value:
          deserializer: com.xpto.MySetupDTODeserializer
      consumer:
        batch-mode: true
        startOffset: earliest
        resetOffsets: true
```
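The `com.xpto.MySetupDTODeserializer` referenced as the value deserializer is, roughly, a JSON deserializer for the DTO; a minimal sketch, assuming it simply delegates to Spring Kafka's `JsonDeserializer`:

```java
// Sketch: custom value deserializer delegating to Spring Kafka's JsonDeserializer
public class MySetupDTODeserializer extends JsonDeserializer<MySetupDTO> {
    public MySetupDTODeserializer() {
        super(MySetupDTO.class); // bind the target DTO type
    }
}
```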
Error:

```text
Bean named 'batchConverter' is expected to be of type 'org.springframework.kafka.support.converter.MessagingMessageConverter' but was actually of type 'org.springframework.kafka.support.converter.BatchMessagingMessageConverter'
```
Version: spring-cloud-stream 3.0.12.RELEASE