
I am building a microservice component that will, by default, consume Kafka messages generated by other Spring Cloud Stream (SCS) components.

But I also have a requirement to consume Kafka messages from other components that are using the Confluent API.

I have an example repository that shows what I'm trying to do.

https://github.com/donalthurley/KafkaConsumeScsAndConfluent

Below is the application configuration, with both the SCS input binding and the Confluent input binding.

spring:
  application:
    name: kafka
  kafka:
    consumer:
      properties.schema.registry.url: http://192.168.99.100:8081
  cloud:
    stream:
      kafka:
        binder:
          brokers: PLAINTEXT://192.168.99.100:9092
#          configuration:
#            specific:
#              avro:
#                reader: true
#            key:
#              deserializer: org.apache.kafka.common.serialization.StringDeserializer
#            value:
#              deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer

      bindings:
        inputConfluent:
          contentType: application/*+avro
          destination: confluent-destination
          group: input-confluent-group
        inputScs:
          contentType: application/*+avro
          destination: scs-destination
          group: input-scs-group

With the above configuration, both consumers are created with the SCS default configuration. For instance, the class org.apache.kafka.common.serialization.ByteArrayDeserializer is the value deserializer for both input bindings.

If I remove the comments in the above configuration, both consumers get the configuration intended for my Confluent client. For instance, the class io.confluent.kafka.serializers.KafkaAvroDeserializer is then the value deserializer for both input bindings.

I understand that because the configuration is on the Kafka binder, it applies to all consumers defined with that binder.

Is there any way to define those specific properties so that they apply only to the Confluent-specific consumer binding, while all the other input bindings use the default SCS config?

Donal Hurley

1 Answer


You can set binding-specific consumer and producer properties via the configuration property.

See the reference manual.

spring.cloud.stream.kafka.bindings.<channelName>.consumer.configuration.foo.bar=baz
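Applied to the configuration in the question, that might look like the following sketch, which moves the commented-out binder properties (and the schema registry URL) under the `inputConfluent` binding only; the property names are taken from the question's commented-out block, so they reach just that consumer while `inputScs` keeps the SCS defaults:

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          inputConfluent:
            consumer:
              configuration:
                # Confluent-specific Kafka consumer properties, scoped to this binding only
                specific.avro.reader: true
                schema.registry.url: http://192.168.99.100:8081
                key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
                value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
```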

When using non-standard serializers/deserializers you must set useNativeEncoding and useNativeDecoding on producers and consumers, respectively. Again, see the reference manual.
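For the consumer side here, that means enabling native decoding on the Confluent binding so the Kafka deserializer, rather than an SCS message converter, decodes the payload. A sketch, assuming the `inputConfluent` binding from the question (note this property lives under `spring.cloud.stream.bindings`, not the Kafka-specific `spring.cloud.stream.kafka.bindings` tree):

```yaml
spring:
  cloud:
    stream:
      bindings:
        inputConfluent:
          consumer:
            # Let the configured Kafka deserializer handle decoding
            useNativeDecoding: true
```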

Gary Russell
  • I made the above change as suggested, with the configuration placed under `spring.cloud.stream.kafka.bindings.inputConfluent.consumer.configuration` and `useNativeDecoding` set, and this worked. See https://github.com/donalthurley/KafkaConsumeScsAndConfluent/commit/c763a32f0a6ec0099e4fad65f662fb09b06249b6 – Donal Hurley Aug 08 '18 at 17:01
  • I think when I was trying this earlier I was mixing up these two concepts: [consumer properties](https://docs.spring.io/spring-cloud-stream/docs/Elmhurst.RELEASE/reference/htmlsingle/index.html#_consumer_properties) and [Kafka consumer properties](https://docs.spring.io/spring-cloud-stream/docs/Elmhurst.RELEASE/reference/htmlsingle/index.html#kafka-consumer-properties) – Donal Hurley Aug 08 '18 at 17:01
  • Right there are generic producer/consumer properties (common to all binders) and binder-specific producer/consumer properties. – Gary Russell Aug 08 '18 at 17:18