I'm using a PollableMessageSource input to read from a Kafka topic. The messages on that topic are Avro-encoded, and use-native-decoding was set to true when they were published.
This is how I'm polling:
pollableChannels.inputChannel().poll(this::processorMethodName,
        new ParameterizedTypeReference<TypeClassName>() {
        });
pollableChannels is just an injected instance of this interface:
public interface PollableChannels {
@Input("input-channel")
PollableMessageSource inputChannel();
}
After noticing that the TypeClassName payload was not being formed properly (its nested objects were wrongly set to null), I started debugging the poll method. I found that it relies on the contentType header to select a converter, and since that header has not been set (because the messages were encoded natively), it falls back to the ApplicationJsonMessageMarshallingConverter, which is clearly not the right option.
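To make that fallback concrete, here is a simplified, self-contained illustration of the selection behavior I observed while debugging. This is my own sketch, not the framework's actual code; the class and method names here are made up:

```java
import java.util.Map;

public class ConverterSelectionSketch {
    // Illustrative only: mimics the behavior seen while debugging poll().
    // The converter is chosen from the contentType header; natively-encoded
    // messages carry no such header, so the JSON converter is picked.
    static String selectConverter(Map<String, String> headers) {
        String contentType = headers.get("contentType");
        if (contentType != null && contentType.contains("avro")) {
            return "AvroSchemaMessageConverter";
        }
        // Fallback taken when the contentType header is absent
        return "ApplicationJsonMessageMarshallingConverter";
    }

    public static void main(String[] args) {
        // Natively-encoded message: no contentType header present
        System.out.println(selectConverter(Map.of()));
        // Message with an explicit Avro contentType header
        System.out.println(selectConverter(Map.of("contentType", "application/*+avro")));
    }
}
```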
If I use a regular @StreamListener, the use-native-decoding config property is honored fine, so the messages appear to be published correctly.
Therefore, my primary question is: how do I force native decoding when using pollable consumers? My broader question is whether properties under spring.cloud.stream.bindings.channel-name.consumer are respected at all when using a pollable consumer.
Spring Cloud Stream version: 2.2.0.RELEASE
Spring Kafka version: 2.2.5.RELEASE
Confluent version for the serializer: 5.2.1
Update:
Relevant config:
spring:
  cloud.stream:
    bindings:
      input-channel:
        content-type: application/*+avro
        destination: "topic-name"
        group: "group-name"
        consumer:
          partitioned: true
          concurrency: 3
          max-attempts: 1
          use-native-decoding: true
    kafka:
      binder:
        configuration:
          key.serializer: org.apache.kafka.common.serialization.StringSerializer
          value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
          key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
          value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
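For reference, Spring Cloud Stream also supports applying binding properties to all bindings via the defaults mechanism; a minimal sketch of that placement (I have only verified the per-binding form above, so this is an assumption on my part):

```yaml
spring:
  cloud.stream:
    default:
      consumer:
        use-native-decoding: true
```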