
Currently I use JSON for the messages in Spring Kafka, and that is pretty easy and works with almost no coding:

@KafkaListener(topics = "foo.t")
public void receive(Foo payload) {
    LOG.info("received payload='{}'", payload);
    doMagic(payload);
}

@KafkaListener(topics = "bar.t")
public void receive(Bar payload) {
    LOG.info("received payload='{}'", payload);
    doMagic(payload);
}

and only a little config:

# Kafka Config
spring.kafka.bootstrap-servers=broker.kafka
spring.kafka.consumer.group-id=some-app
spring.kafka.consumer.properties.value.deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=com.example.someapp.dto

This works (great) because the content/type information is encoded along with the JSON and can thus be restored from the bytes alone. (There are issues with that as well, but it works.) With protobuf, however, I don't have this meta-information, or at least I don't know where to find it.
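
For reference, the matching producer side of my setup is a single property; Spring's JsonSerializer records the payload type (by default in a `__TypeId__` record header) so the consumer can restore it:

# Producer side: JsonSerializer adds the type information for the consumer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer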

Question:

Is there a way to declare a generic Kafka MessageConverter that works for multiple types, without throwing all of Spring's nice abstraction/auto-configuration out of the window?

(I would also like to use this for JSON, as encoding the content/data type of the message in the message itself has both security and compatibility issues.)

I would like to avoid something like this solution: https://stackoverflow.com/a/46670801/4573065 .

Alternatives

Write a message converter/Deserializer that tries all known payload classes:

@Override
@SuppressWarnings("unchecked")
public T deserialize(String topic, byte[] data) {
    try {
        // Optimistically assume the bytes are a Foo ...
        return (T) Foo.parseFrom(data);
    } catch (Exception e) {
        try {
            // ... and fall back to Bar if that fails.
            return (T) Bar.parseFrom(data);
        } catch (Exception e1) {
            // deserialize() declares no checked exceptions, so wrap in the
            // unchecked org.apache.kafka.common.errors.SerializationException.
            throw new SerializationException("Unknown message type on topic " + topic, e);
        }
    }
}

However, this will probably negate any performance gains I hope to get from using a binary format.

Another alternative would be to statically map topic -> content type (see the sketch below). However, this is still error prone, or at least hard to get Spring to configure for you.
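
A minimal sketch of that mapping (the topics and the protobuf-generated classes Foo/Bar are the ones from above; TopicMappingDeserializer is a made-up name, and I assume a kafka-clients version where Deserializer's configure()/close() have default implementations):

import java.util.Map;

import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;

import com.example.someapp.dto.Bar;
import com.example.someapp.dto.Foo;
import com.google.protobuf.InvalidProtocolBufferException;
import com.google.protobuf.Message;
import com.google.protobuf.Parser;

public class TopicMappingDeserializer implements Deserializer<Message> {

    // Static topic -> parser mapping; every new topic/type must be added here.
    private static final Map<String, Parser<? extends Message>> PARSERS = Map.of(
            "foo.t", Foo.parser(),
            "bar.t", Bar.parser());

    @Override
    public Message deserialize(String topic, byte[] data) {
        Parser<? extends Message> parser = PARSERS.get(topic);
        if (parser == null) {
            throw new SerializationException("No parser mapped for topic " + topic);
        }
        try {
            return parser.parseFrom(data);
        } catch (InvalidProtocolBufferException e) {
            throw new SerializationException("Failed to parse record from topic " + topic, e);
        }
    }
}

It could then be registered like the JSON one:

spring.kafka.consumer.value-deserializer=com.example.someapp.TopicMappingDeserializer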


EDIT: My producer looks like this:

public void send(String message) {
    kafkaTemplate.send("string.t", message);
}
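
For protobuf payloads the producer side would presumably be just as type-less; a minimal sketch (ProtobufSerializer is a made-up name, same kafka-clients assumption as above):

import org.apache.kafka.common.serialization.Serializer;

import com.google.protobuf.Message;

public class ProtobufSerializer implements Serializer<Message> {

    @Override
    public byte[] serialize(String topic, Message data) {
        // Raw wire bytes only: unlike JsonSerializer, no type information
        // ends up in the record, which is exactly the problem described above.
        return data == null ? null : data.toByteArray();
    }
}
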
ST-DDT
  • You can use the `key` to determine the type. What does the producer code look like? – Paizo Jul 23 '18 at 11:15
  • Well, the key is (de-)serialized using a different serializer/step, but there is a sub-interface (which is also used by Jackson) to place the type info inside the header: https://github.com/apache/kafka/blob/2.0/clients/src/main/java/org/apache/kafka/common/serialization/ExtendedSerializer.java#L43 However, I kind of don't like encoding application-specific data (such as the FQCN) in the header and thus forcing all producers/consumers to provide/handle it; some might not even be Java. – ST-DDT Jul 23 '18 at 13:54
  • With JSON you really can just deserialize `byte[]` to the type you want to have on the consumer side, independently of what type it was serialized as on the producer side. Not sure if that is going to work the same smooth way with Protobuf... – Artem Bilan Jul 23 '18 at 15:00
  • @ArtemBilan Unfortunately this is something I currently cannot do (even in JSON), because there is no contextual information about what the consumer expects in the convert method. That's why I have to either create a converter for each data type I expect or use the metadata the client has sent. Neither is optimal. – ST-DDT Jul 24 '18 at 07:12
  • Well, but you don't have a choice if the consumer doesn't provide such information. You really need to rely on the producer metadata supplied in the record or its headers. Not sure, though, why you say that headers are not optimal: that is really the way we do it in many places. – Artem Bilan Jul 24 '18 at 12:20
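
For illustration, here is a minimal sketch of the header approach discussed in the comments. The header name proto.type is made up, and I assume a kafka-clients version where the Headers-aware deserialize variant sits on Deserializer itself (in 2.0 it still lives in ExtendedDeserializer):

import java.lang.reflect.Method;
import java.nio.charset.StandardCharsets;

import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.header.Headers;
import org.apache.kafka.common.serialization.Deserializer;

import com.google.protobuf.Message;

public class HeaderMappingDeserializer implements Deserializer<Message> {

    @Override
    public Message deserialize(String topic, byte[] data) {
        // Without headers there is nothing to dispatch on.
        throw new SerializationException("Type header required for topic " + topic);
    }

    @Override
    public Message deserialize(String topic, Headers headers, byte[] data) {
        try {
            // Read the FQCN the producer put into the (made-up) proto.type header ...
            String fqcn = new String(headers.lastHeader("proto.type").value(), StandardCharsets.UTF_8);
            // ... and invoke the static parseFrom(byte[]) of the generated class.
            Method parseFrom = Class.forName(fqcn).getMethod("parseFrom", byte[].class);
            return (Message) parseFrom.invoke(null, (Object) data);
        } catch (ReflectiveOperationException e) {
            throw new SerializationException("Cannot restore message type", e);
        }
    }
}

This keeps a single deserializer for all types, at the price of exactly the cross-language header coupling criticized above.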

0 Answers