
I have a User class that I am serializing as Avro (using the Confluent Avro serializer and Schema Registry) and publishing to a Kafka topic. I made a consumer that prints the data to the console, and it works fine. What I am trying to do now is recreate the original object from this data. For example, I publish a "User" object as Avro to the Kafka topic, and after consuming it I want to recreate that User object (instead of printing to the console). Is this possible?

Below is my code

User class

public class User { 
    int id;
    String name;    
    public User(){} 
    public User(int id, String name) {
        super();
        this.id = id;
        this.name = name;
    }

    public int getId() {
        return id;
    }

    public void setId(int id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }   
}

Consumer code

User user = new User();

Properties props = new Properties();
props.put("bootstrap.servers", "127.0.0.1:9092");
props.put("group.id", "avro-consumer-group");
props.put("key.deserializer", io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
props.put("value.deserializer", io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
props.put("schema.registry.url", "http://127.0.0.1:8081");

KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<String, GenericRecord>(props);

consumer.subscribe(Arrays.asList("avrotesttopic"));
System.out.println("Subscribed to topic " + "avrotesttopic");

while (true) {
    ConsumerRecords<String, GenericRecord> records = consumer.poll(100);
    for (org.apache.kafka.clients.consumer.ConsumerRecord<String, GenericRecord> record : records) {
        System.out.printf("value = %s%n", record.value());
        // output -> value = {"id": 10, "name": "testName"}
    }
}

Thanks


3 Answers


Considering that you are using KafkaAvroDeserializer, you will need to set the following property as part of your consumer configuration:

    props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");

This lets the consumer hand you a SpecificRecord, already deserialized into your generated class, instead of a GenericRecord. Here is an example:

https://dzone.com/articles/kafka-avro-serialization-and-the-schema-registry
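
A minimal sketch of what the consumer could then look like, assuming User has been regenerated from an .avsc schema (e.g. with the Avro Maven plugin) so that it implements SpecificRecord; the hand-written POJO from the question will not deserialize this way:

import java.util.Arrays;
import java.util.Properties;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

Properties props = new Properties();
props.put("bootstrap.servers", "127.0.0.1:9092");
props.put("group.id", "avro-consumer-group");
props.put("key.deserializer", KafkaAvroDeserializer.class);
props.put("value.deserializer", KafkaAvroDeserializer.class);
props.put("schema.registry.url", "http://127.0.0.1:8081");
props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");

// The value type is now the generated User class, not GenericRecord
KafkaConsumer<String, User> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Arrays.asList("avrotesttopic"));

while (true) {
    ConsumerRecords<String, User> records = consumer.poll(100);
    for (ConsumerRecord<String, User> record : records) {
        User user = record.value();   // already deserialized into a User instance
        System.out.println(user.getId() + " " + user.getName());
    }
}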

hlagos

You can convert the Avro record back to JSON (using its schema), and then map that JSON onto a POJO class.

Convert Avro to JSON:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.io.JsonEncoder;

static void convertAvroToJson(InputStream inputStream, OutputStream outputStream, Schema schema)
        throws IOException {
    DatumReader<Object> reader = new GenericDatumReader<>(schema);
    DatumWriter<Object> writer = new GenericDatumWriter<>(schema);

    // Decode the binary Avro input and re-encode each datum as pretty-printed JSON
    BinaryDecoder binaryDecoder = DecoderFactory.get().binaryDecoder(inputStream, null);
    JsonEncoder jsonEncoder = EncoderFactory.get().jsonEncoder(schema, outputStream, true);

    Object datum = null;
    while (!binaryDecoder.isEnd()) {
        datum = reader.read(datum, binaryDecoder);
        writer.write(datum, jsonEncoder);
        jsonEncoder.flush();
    }
    outputStream.flush();
}
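
For illustration, a hedged usage sketch (not part of the answer). Note that this method expects plain binary Avro; bytes produced by the Confluent serializer carry a 5-byte prefix (magic byte plus 4-byte schema id) that would have to be skipped first. 'avroBytes' and user.avsc are hypothetical inputs:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;

import org.apache.avro.Schema;

// Hypothetical inputs: 'avroBytes' holds a plain binary-encoded Avro User
// record and user.avsc is its writer schema
Schema schema = new Schema.Parser().parse(new File("user.avsc"));
ByteArrayInputStream in = new ByteArrayInputStream(avroBytes);
ByteArrayOutputStream out = new ByteArrayOutputStream();
convertAvroToJson(in, out, schema);
String json = out.toString("UTF-8");   // e.g. {"id": 10, "name": "testName"}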

And then convert the JSON to a POJO; this article describes one approach:

https://dzone.com/articles/converting-json-to-pojos-using-java
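
Alternatively, once you have the JSON string, a library such as Jackson can map it straight onto the existing POJO. A minimal sketch, assuming Jackson is on the classpath and 'json' holds the output of convertAvroToJson above:

import com.fasterxml.jackson.databind.ObjectMapper;

ObjectMapper mapper = new ObjectMapper();
// Jackson uses the no-arg constructor and setters of the hand-written User class
User user = mapper.readValue(json, User.class);
System.out.println(user.getId() + " " + user.getName());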

ideano1

It depends on how you are serializing the input records to Avro.

Short answer: if your serialization is fine, then you only have to change how you receive the records:

ConsumerRecords<String, User> records = consumer.poll(100);

For a full example, see this.
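
Note that this only type-checks if the deserializer actually returns your class, which (as the first answer explains) requires specific.avro.reader=true and an Avro-generated User. A hedged alternative, not from this answer: keep the question's GenericRecord consumer and rebuild the hand-written POJO field by field:

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;

// A minimal sketch: map each GenericRecord onto the POJO by hand
ConsumerRecords<String, GenericRecord> records = consumer.poll(100);
for (ConsumerRecord<String, GenericRecord> record : records) {
    GenericRecord value = record.value();
    User user = new User(
            (Integer) value.get("id"),        // Avro int field arrives as Integer
            value.get("name").toString());    // Avro strings are Utf8, so toString()
}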

asolanki