
I'm new to Kafka. Here is a question I have about an ever-changing Kafka schema.

How can we handle schema changes at the Kafka consumer end?

If we change the payload structure at the Kafka publisher end, how can I make sure nothing breaks at the Kafka consumer end?

I would be interested to know industry-wide practices for handling this scenario.

I won't be using Confluent's Schema Registry for Avro. Are there any other tried-and-tested options?


1 Answer


Schema Registry is the standard solution for centralized schema management and for compatibility checks as schemas evolve. Configure the Schema Registry URL in your Kafka producers and consumers:

import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
import io.confluent.kafka.serializers.KafkaAvroSerializerConfig;

// Point the producer's Avro serializer at the Schema Registry.
kafkaProducerProps.put(KafkaAvroSerializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");

// Deserialize into generated SpecificRecord classes, and resolve
// schemas against the same registry.
kafkaConsumerProps.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
kafkaConsumerProps.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");

Run the Schema Registry on port 8081, matching the URL configured above.
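Separately, since you mention avoiding Confluent's registry: Avro itself ships a compatibility checker, so the same reader/writer evolution rules can be verified in a plain unit test with no registry at all. Below is a minimal sketch; the User schemas are hypothetical, and v2 adds an optional field with a default, which is the classic change that does not break existing consumers.

import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;
import org.apache.avro.SchemaCompatibility.SchemaPairCompatibility;

public class CompatibilityCheck {
    public static void main(String[] args) {
        // v1: the schema existing consumers were built against.
        Schema readerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"long\"}]}");

        // v2: the producer adds an optional field WITH a default,
        // so the change is compatible in both directions.
        Schema writerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"long\"},"
          + "{\"name\":\"email\",\"type\":[\"null\",\"string\"],\"default\":null}]}");

        // Can a v1 reader decode data written with the v2 schema?
        SchemaPairCompatibility result =
            SchemaCompatibility.checkReaderWriterCompatibility(readerSchema, writerSchema);
        System.out.println(result.getType()); // COMPATIBLE
    }
}

Running such a check in CI before deploying a producer change is a common way to guarantee nothing breaks at the consumer end.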

Refer to the URL below for sample code:

https://dzone.com/articles/kafka-avro-serialization-and-the-schema-registry
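If you do adopt the registry, you can also ask it whether a candidate schema is compatible with what is already registered before publishing. A rough sketch, assuming a recent kafka-schema-registry-client (5.5+, where testCompatibility takes a ParsedSchema) and the default "&lt;topic&gt;-value" subject naming; the topic "users" and the schema here are hypothetical:

import io.confluent.kafka.schemaregistry.avro.AvroSchema;
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class RegistryCheck {
    public static void main(String[] args) throws Exception {
        SchemaRegistryClient client =
            new CachedSchemaRegistryClient("http://localhost:8081", 100);

        // Candidate v2 schema the producer wants to start publishing.
        AvroSchema candidate = new AvroSchema(
            "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"long\"},"
          + "{\"name\":\"email\",\"type\":[\"null\",\"string\"],\"default\":null}]}");

        // Default TopicNameStrategy: the subject is "<topic>-value".
        boolean ok = client.testCompatibility("users-value", candidate);
        System.out.println(ok ? "safe to publish" : "would break consumers");
    }
}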
