I don't see an example of how to use the camel-avro component to produce and consume Kafka Avro messages. My current Camel route is shown below. What should be changed so that it works with the Schema Registry and other properties like these, using a camel-kafka Avro consumer & producer?

props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);              
props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true); 
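
For context, here is roughly how those properties would be wired into a plain KafkaConsumer (a minimal sketch; the topic name and group id are placeholders):

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PlainAvroConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group"); // placeholder
        props.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);

        try (KafkaConsumer<String, Object> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic")); // placeholder
            while (true) {
                ConsumerRecords<String, Object> records = consumer.poll(Duration.ofSeconds(1));
                records.forEach(r -> System.out.println(r.value()));
            }
        }
    }
}

The question is how to express these same settings on the Camel endpoint: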

public void configure() {
        PropertiesComponent pc = getContext().getComponent("properties", PropertiesComponent.class); 
        pc.setLocation("classpath:application.properties");

        log.info("About to start route: Kafka Server -> Log ");

        from("kafka:{{consumer.topic}}?brokers={{kafka.host}}:{{kafka.port}}"
                + "&maxPollRecords={{consumer.maxPollRecords}}"
                + "&consumersCount={{consumer.consumersCount}}"
                + "&seekTo={{consumer.seekTo}}"
                + "&groupId={{consumer.group}}"
                +"&valueDeserializer="+KafkaAvroDeserializer.class
                +"&keyDeserializer="+StringDeserializer.class
                )
                .routeId("FromKafka")
            .log("${body}");
}
  • Why is this downvoted? Could someone please explain? Even with this config it doesn't work: value.deserializer = class io.confluent.kafka.serializers.KafkaAvroDeserializer. – driven_spider Mar 15 '19 at 20:45
  • Can you clarify what **does** happen? – OneCricketeer Mar 16 '19 at 20:18
  • @cricket_007: The application starts and sets the Kafka consumer. Look at this value: AvroDeserializer is my own class, because I couldn't get a Kafka connection established with Confluent's Avro consumer. -> ***value.deserializer = class org.apache.camel.example.kafka.AvroDeserializer*** 2019-03-18 07:56:40,663 [nsumer[avro-t1]] WARN KafkaConsumer - ***KafkaException consuming avro-t1-Thread 0 from topic avro-t1. Will attempt to re-connect on next run*** ----- this is the exception, and the application keeps re-connecting to the Kafka broker and hence fails. – driven_spider Mar 18 '19 at 02:29
  • Hmm. I don't recognize this `org.apache.camel.example.kafka.AvroDeserializer`. In theory, any deserializer on the classpath should work. Could you enable debug logging, maybe? – OneCricketeer Mar 18 '19 at 20:00
  • @cricket_007: thanks for helping me with the thinking process. – driven_spider Mar 19 '19 at 05:43

3 Answers

I'm answering my own question because I sat on this problem for a couple of days. I hope this answer will be helpful for others.

I tried to use the io.confluent.kafka.serializers.KafkaAvroDeserializer and got a KafkaException, so I had to write my own deserializer to do the following things:

  1. set the schema registry
  2. use the specific Avro reader (so you get back SpecificRecord instances instead of the default GenericRecord ones)

That means accessing the schemaRegistry and useSpecificAvroReader fields inherited from AbstractKafkaAvroDeserializer (io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer) and setting them ourselves.

Here is the solution...

CAMEL-KAFKA-AVRO-ROUTE-BUILDER

public static void main(String[] args) throws Exception {
    LOG.info("About to run Kafka-camel integration...");
    CamelContext camelContext = new DefaultCamelContext();
    // Add route to send messages to Kafka
    camelContext.addRoutes(new RouteBuilder() {
        public void configure() throws Exception {                
            PropertiesComponent pc = getContext().getComponent("properties", 
                                      PropertiesComponent.class);
            pc.setLocation("classpath:application.properties");

            log.info("About to start route: Kafka Server -> Log ");

            from("kafka:{{consumer.topic}}?brokers={{kafka.host}}:{{kafka.port}}"
                    + "&maxPollRecords={{consumer.maxPollRecords}}"
                    + "&consumersCount={{consumer.consumersCount}}"
                    + "&seekTo={{consumer.seekTo}}"
                    + "&groupId={{consumer.group}}"
                    + "&keyDeserializer="+ StringDeserializer.class.getName() 
                    + "&valueDeserializer="+CustomKafkaAvroDeserializer.class.getName()
                    )
                    .routeId("FromKafka")
                .log("${body}");

        }
    });
    camelContext.start();
    // let it run for 5 minutes before shutting down
    Thread.sleep(5 * 60 * 1000);
    camelContext.stop();
}

DESERIALIZER CLASS - this sets schema.registry.url and use.specific.avro.reader at the AbstractKafkaAvroDeserializer level. Without this, I would get a Kafka config exception at runtime.

package com.example.camel.kafka.avro;

import java.util.Collections;
import java.util.List;
import java.util.Map;

import io.confluent.common.config.ConfigException;
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
import org.apache.kafka.common.serialization.Deserializer;

public class CustomKafkaAvroDeserializer extends AbstractKafkaAvroDeserializer
    implements Deserializer<Object> {

    private static final String SCHEMA_REGISTRY_URL = "http://localhost:8081";

    @Override
    public void configure(KafkaAvroDeserializerConfig config) {
        try {
            // Hard-wire the registry client and enable the specific Avro
            // reader, since the Camel endpoint never passes these
            // properties through to the deserializer.
            final List<String> schemaRegistryUrls =
                    Collections.singletonList(SCHEMA_REGISTRY_URL);
            this.schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryUrls,
                    Integer.MAX_VALUE);
            this.useSpecificAvroReader = true;
        } catch (ConfigException e) {
            throw new org.apache.kafka.common.config.ConfigException(e.getMessage());
        }
    }

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // Kafka calls this variant; delegate to the hard-wired configuration above.
        configure(null);
    }

    @Override
    public Object deserialize(String topic, byte[] bytes) {
        // Delegates to AbstractKafkaAvroDeserializer#deserialize(byte[]).
        return deserialize(bytes);
    }

    @Override
    public void close() {
    }
}
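
The question also asked about the producer side. I only needed the consumer, but a symmetric serializer can be sketched the same way. This is an untested sketch, assuming the Confluent 5.x API of that era, where AbstractKafkaAvroSerializer exposes a protected serializeImpl(subject, object) method and an autoRegisterSchema field, and "<topic>-value" is the default subject name:

package com.example.camel.kafka.avro;

import java.util.Collections;
import java.util.Map;

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.serializers.AbstractKafkaAvroSerializer;
import org.apache.kafka.common.serialization.Serializer;

public class CustomKafkaAvroSerializer extends AbstractKafkaAvroSerializer
    implements Serializer<Object> {

    private static final String SCHEMA_REGISTRY_URL = "http://localhost:8081";

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // Hard-wire the registry client, mirroring CustomKafkaAvroDeserializer.
        this.schemaRegistry = new CachedSchemaRegistryClient(
                Collections.singletonList(SCHEMA_REGISTRY_URL), Integer.MAX_VALUE);
        // Register the schema on first use (assumes the 5.x protected field).
        this.autoRegisterSchema = true;
    }

    @Override
    public byte[] serialize(String topic, Object record) {
        // "<topic>-value" is the default TopicNameStrategy subject.
        return serializeImpl(topic + "-value", record);
    }

    @Override
    public void close() {
    }
}

It would be referenced from a to("kafka:...") endpoint via the producer-side options, e.g. "&serializerClass=" + CustomKafkaAvroSerializer.class.getName() (note that the producer option names, serializerClass and keySerializerClass, are asymmetric to the consumer's valueDeserializer/keyDeserializer; compare the Spring Boot answer below).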
  • btw, it was recently added - https://stackoverflow.com/a/55437784/2308683 – OneCricketeer Apr 08 '19 at 22:51
  • @cricket_007: I couldn't see it at the time when I was searching for a solution. Thanks for your thorough search. – driven_spider Apr 22 '19 at 10:27
  • Hey, just wondering, was this a work project or something you could share? I would love to see the full example on GitHub or somewhere like that. Thanks – AnonymousAlias Jun 17 '19 at 13:49
  • @MickO'Gorman: sorry that I didn't look at this before. Look at my route: https://github.com/hykavitha/camel-kafka-microservice/blob/master/src/main/java/com/marriott/poc/crm/route/MsgRouter_CamelRouteConfig.java and at the CustomKafkaAvroDeserializer: https://github.com/hykavitha/camel-kafka-microservice/blob/master/src/main/java/com/marriott/poc/crm/kafka/avro/serializer/CustomKafkaAvroDeserializer.java – driven_spider Jun 26 '19 at 17:42

Using camel-kafka-starter (for Spring Boot) version 3.6.0, you don't need to define a CustomKafkaAvroDeserializer. Instead, add the following configuration to your application.yaml or application.properties file, and the camel-kafka component (both producer and consumer) will apply the appropriate SerDes to the objects/bytes being processed.

camel:
  springboot:
    main-run-controller: true # to keep the JVM running
  component:
    kafka:
      brokers: localhost:9092   # plain host:port, no http:// scheme
      schema-registry-u-r-l: http://localhost:8081

      #Consumer
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer

      #Producer
      key-serializer-class: org.apache.kafka.common.serialization.StringSerializer
      serializer-class: io.confluent.kafka.serializers.KafkaAvroSerializer

      specific-avro-reader: true

You also need to make sure to upload the respective Avro schema (JSON) files to your schema registry server, e.g. the Confluent Schema Registry, before running the producer/consumer.
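
With that in place, the route itself needs no SerDe settings at all. A minimal sketch of such a route (the topic name, package, and class name here are placeholders):

package com.example;

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component // auto-discovered by camel-spring-boot
public class KafkaAvroRoute extends RouteBuilder {

    @Override
    public void configure() {
        // Broker list, deserializers, and schema-registry URL all come
        // from the component-level configuration in application.yaml.
        from("kafka:avro-topic")
            .routeId("FromKafka")
            .log("${body}");
    }
}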


I struggled with the same question for a while. I made a full example with camel-quarkus and Confluent's Schema Registry: https://github.com/tstuber/camel-quarkus-kafka-schema-registry
