I am sending Avro data to a sink topic using Kafka Streams in Java with the GenericAvroSerde, and I am getting the following serialization error.

Error stack:
org.apache.kafka.common.errors.SerializationException: Error serializing Avro message
caused by java.lang.IllegalStateException: Too many schema objects created for TestTopic!
   at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient)

Java code:

public Topology createTopology(String inputTopic, String outputTopic, String badRecord, String schemaRegistryUrl) {
    StreamsBuilder builder = new StreamsBuilder();
    try {
        // Read the raw input as plain strings
        KStream<String, String> textLines = builder.stream(inputTopic,
                Consumed.with(Serdes.String(), Serdes.String()));

        // Configure a GenericAvroSerde (for record values) against the schema registry
        GenericAvroSerde gserde = new GenericAvroSerde();
        gserde.configure(Collections.singletonMap(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG,
                schemaRegistryUrl), false);

        // Split the stream: records matching validateData go to the bad-record topic,
        // the rest are converted to Avro and written to the output topic
        KStream<String, String>[] branches = textLines.branch(
                (k, v) -> validateData(v),
                (k, v) -> true
        );

        branches[0].to(badRecord, Produced.with(Serdes.String(), Serdes.String()));
        branches[1].mapValues(v -> someProcessing(v, avObj.convert(v)))
                   .filterNot((k, v) -> v == null)
                   .to(outputTopic, Produced.with(Serdes.String(), gserde));

    } catch (Exception e) {
        logger.error("Error while consuming message:", e);
    }
    return builder.build();
}
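
The helpers someProcessing and avObj.convert are not shown above. For context, here is a minimal sketch of what such a converter could look like; the class name, schema, and field names are my assumptions, not the actual code. It parses the Avro schema once and reuses the same Schema instance for every record, since repeatedly creating new Schema objects per message is a common way to hit the "Too many schema objects created" limit in CachedSchemaRegistryClient.

    // Hypothetical sketch only; not the real converter used in the topology above.
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;

    public class AvroConverter {

        // Parse the schema a single time and reuse it; a fresh Schema object per
        // message can overflow the schema-registry client's schema cache.
        private static final Schema SCHEMA = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"TestRecord\",\"fields\":[{\"name\":\"value\",\"type\":\"string\"}]}");

        public GenericRecord convert(String line) {
            GenericRecord record = new GenericData.Record(SCHEMA);
            record.put("value", line);
            return record;
        }
    }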

Could someone please help me understand what is causing this error and how to fix it?
