
I have a .NET application that writes JSON messages to a Kafka topic using the Confluent JSON Schema aware serializer.

I have another Spring Boot (Java) application that reads from that topic, builds a KTable grouped by key, and writes the values as JSON to another output topic.

The Spring Boot Java code is as follows:

application.properties

server.port=9081
#stream config
spring.kafka.streams.bootstrap-servers=localhost:9092
spring.kafka.streams.properties.schema.registry.url=http://localhost:8081
spring.kafka.streams.application-id=KTable-aggregations
spring.kafka.streams.properties.group.id=stream-group
spring.kafka.streams.replication-factor=1
spring.kafka.streams.properties.commit.interval.ms=30000
spring.kafka.streams.properties.cache.max.bytes.buffering=10485760
spring.kafka.streams.properties.metadata.max.age.ms=10000
spring.kafka.streams.properties.num.stream.threads=1
spring.kafka.streams.properties.default.key.serde=org.apache.kafka.common.serialization.Serdes$StringSerde
spring.kafka.streams.properties.default.value.serde=io.confluent.kafka.streams.serdes.json.KafkaJsonSchemaSerde
spring.kafka.streams.properties.spring.json.value.default.type=com.kafkastreams.aggregate.model.Output
spring.kafka.streams.properties.default.deserialization.exception.handler=org.apache.kafka.streams.errors.LogAndContinueExceptionHandler
spring.kafka.streams.properties.compression.type=gzip
#spring.kafka.streams.properties.state.dir=/var/lib/kafka-streams
#spring.kafka.streams.properties.state.cleanup.delay.ms=600000
spring.kafka.streams.properties.auto.offset.reset=latest
spring.kafka.streams.properties.timestamp.extractor=org.apache.kafka.streams.processor.WallclockTimestampExtractor

#default queue
kafka.topic.input=input-topic
kafka.topic.output=output-topic

MyApi.java

@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
    "id",
    "name",
    "address",
})
public class MyAPI{

    @JsonProperty("id")
    private long id;
    @JsonProperty("name")
    private String name;
    @JsonProperty("address")
    private String address;

    @JsonProperty("id")
    public long getId() {
        return id;
    }

    @JsonProperty("id")
    public void setId(long id) {
        this.id = id;
    }

    @JsonProperty("name")
    public String getName() {
        return name;
    }

    @JsonProperty("name")
    public void setName(String name) {
        this.name = name;
    }

    @JsonProperty("address")
    public String getAddress() {
        return address;
    }

    @JsonProperty("address")
    public void setAddress(String address) {
        this.address = address;
    }
}

Output.java

@JsonPropertyOrder({
    "id",
    "name",
    "address",
})
public class Output{

    @JsonProperty("id")
    private long id;
    @JsonProperty("name")
    private String name;


    @JsonProperty("id")
    public long getId() {
        return id;
    }

    @JsonProperty("id")
    public void setId(long id) {
        this.id = id;
    }

    @JsonProperty("name")
    public String getName() {
        return name;
    }

    @JsonProperty("name")
    public void setName(String name) {
        this.name = name;
    }

}

KafkaStreamConfig.java

@Configuration
@EnableKafka
@EnableKafkaStreams
public class KafkaStreamConfig {
    private final static Logger logger = LoggerFactory.getLogger(KafkaStreamConfig.class);
    @Autowired
    private KafkaProperties kafkaProperties;

    @Value("${kafka.topic.input}")
    private String inputTopic;

    @Value("${kafka.topic.output}")
    private String outputTopic;

    @Bean
    public KStream<String, Output> kStream(StreamsBuilder kStreamBuilder) {
        Map<String, String> serdeConfig = Collections.singletonMap(
                "schema.registry.url", "http://localhost:8081");

        Serde<MyApi> myApiSerde = new KafkaJsonSchemaSerde<>(); // Not sure if this is the right way of creating a Serde
        myApiSerde.configure(serdeConfig, false);
        Serde<Output> outputSerde = new KafkaJsonSchemaSerde<>(); // Not sure if this is the right way of creating a Serde
        outputSerde.configure(serdeConfig, false);
        Serde<String> stringSerde = Serdes.String();

        KTable<String, Output> cdcTbl = kStreamBuilder
                .stream(inputTopic, Consumed.with(stringSerde, myApiSerde)
                        .withOffsetResetPolicy(Topology.AutoOffsetReset.EARLIEST))
                // project each MyApi record onto the smaller Output type
                .mapValues(v -> {
                    Output out = new Output();
                    out.setId(v.getId());
                    out.setName(v.getName());
                    return out;
                })
                .groupByKey(Grouped.with(stringSerde, outputSerde))
                // keep only the latest value per key
                .reduce((aggValue, newValue) -> newValue /* adder */);

        cdcTbl.toStream()
                .peek((k, v) -> logger.info("API topic key and value {} {}", k, v))
                .to(outputTopic, Produced.with(stringSerde, outputSerde));

        return kStreamBuilder.stream(outputTopic, Consumed.with(stringSerde, outputSerde));
    }


}

I am getting the following error message.

Caused by: org.apache.kafka.streams.errors.StreamsException: ClassCastException invoking Processor. Do the Processor's input types match the deserialized types? Check the Serde setup and change the default Serdes in StreamConfig or provide correct Serdes via method parameters. Make sure the Processor can accept the deserialized input of type key: java.lang.String, and value: java.util.LinkedHashMap. Note that although incorrect Serdes are a common cause of error, the cast exception might have another cause (in user code, for example). For example, if a processor wires in a store, but casts the generics incorrectly, a class cast exception could be raised during processing, but the cause would not be wrong Serdes.

Caused by: java.lang.ClassCastException: java.util.LinkedHashMap cannot be cast to MyApi class

Please help; thanks a lot in advance. Let me know if you need more details.

  • I think you'll need to configure the Serde to know about your specific class. Otherwise, it only knows about `LinkedHashMap`, similar to how the Avro serde would give you a `GenericRecord`. – OneCricketeer Sep 28 '20 at 18:54
  • Hi, please let me know how to configure the Confluent JSON Serde like the Avro Serde. – NARASIMHA MURTHY Sep 29 '20 at 21:19
  • In your SerdeConfig, you'll need to add a pair for `KafkaJsonSchemaDeserializerConfig.JSON_VALUE_TYPE` set to `Output.class` – OneCricketeer Sep 29 '20 at 21:38
  • I have added this in the application.properties file: `spring.kafka.streams.properties.default.key.serde=org.apache.kafka.common.serialization.Serdes$StringSerde`, `spring.kafka.streams.properties.default.value.serde=io.confluent.kafka.streams.serdes.json.KafkaJsonSchemaSerde`, `spring.kafka.streams.properties.spring.json.value.default.type=com.kafkastreams.aggregate.model.Output` – NARASIMHA MURTHY Sep 29 '20 at 21:46
  • Please correct my application.properties config values. – NARASIMHA MURTHY Sep 29 '20 at 21:48
  • Do you need that file? In your code, you have a serdeConfig that needs more values than just the registry URL... The `spring.json.value.default` properties don't use the Confluent Schema Registry; they use the Spring JSON deserializer. – OneCricketeer Sep 30 '20 at 02:58
  • The global config from Spring won't be passed into the `Serde` because you create the `Serde` manually via `new` and thus it's your responsibility to configure it correctly. You would need to add the parameter to the `serdeConfig` object you pass to `configure()` -- as an alternative, try calling `KafkaJsonSchemaSerde(Class specificClass)` instead of the parameter free constructor to set the correct type. – Matthias J. Sax Jan 24 '21 at 20:14
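Putting the comments above together, here is a minimal sketch of the suggested fix: bind the serde to the specific target class, either through the `KafkaJsonSchemaDeserializerConfig.JSON_VALUE_TYPE` config constant or through the `KafkaJsonSchemaSerde(Class)` constructor, so that deserialization yields the POJO rather than a `LinkedHashMap`. The `SerdeFactory` and `jsonSchemaSerde` names are hypothetical, introduced only for illustration.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.common.serialization.Serde;

import io.confluent.kafka.serializers.json.KafkaJsonSchemaDeserializerConfig;
import io.confluent.kafka.streams.serdes.json.KafkaJsonSchemaSerde;

public class SerdeFactory {

    // Builds a value serde bound to a specific POJO class, so the
    // deserializer returns instances of that class instead of LinkedHashMap.
    static <T> Serde<T> jsonSchemaSerde(Class<T> type, String registryUrl) {
        Map<String, Object> config = new HashMap<>();
        config.put("schema.registry.url", registryUrl);
        // Constant from Confluent's kafka-json-schema-serializer;
        // it resolves to the "json.value.type" property.
        config.put(KafkaJsonSchemaDeserializerConfig.JSON_VALUE_TYPE, type.getName());

        Serde<T> serde = new KafkaJsonSchemaSerde<>();
        serde.configure(config, false); // false = value serde, not key serde
        return serde;
    }
}

The stream could then be built with, e.g., `SerdeFactory.jsonSchemaSerde(MyApi.class, "http://localhost:8081")`; per the last comment, `new KafkaJsonSchemaSerde<>(MyApi.class)` should achieve the same binding. If the default-serde route in application.properties is kept instead, the pass-through key would presumably be `spring.kafka.streams.properties.json.value.type=...` rather than `spring.json.value.default.type`, which belongs to Spring's own JsonDeserializer.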

0 Answers