
I'm trying to consume from a topic that I populated using the Kafka Connect feature of Confluent. However, I am not able to deserialize the messages. I believe it's Avro serialization, but I can't find the right deserializer.

This is what the message looks like in the console consumer:

null    {"c1":{"int":10},"c2":{"string":"foo"},"create_ts":1552598863000,"update_ts":1552598863000}

Below is the deserializer:

import java.util.Arrays;
import java.util.Map;

import javax.xml.bind.DatatypeConverter;

import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.Decoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificRecordBase;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class AvroDeserializer<T extends SpecificRecordBase> implements Deserializer<T> {

    private static final Logger LOGGER = LoggerFactory.getLogger(AvroDeserializer.class);

    protected final Class<T> targetType;

    public AvroDeserializer(Class<T> targetType) {
        this.targetType = targetType;
    }

    @Override
    public void close() {
        // No-op
    }

    @Override
    public void configure(Map<String, ?> arg0, boolean arg1) {
        // No-op
    }

    @SuppressWarnings("unchecked")
    @Override
    public T deserialize(String topic, byte[] data) {
        try {
            T result = null;

            if (data != null) {
                LOGGER.debug("data='{}'", DatatypeConverter.printHexBinary(data));

                DatumReader<GenericRecord> datumReader =
                        new SpecificDatumReader<>(targetType.newInstance().getSchema());
                Decoder decoder = DecoderFactory.get().binaryDecoder(data, null);

                result = (T) datumReader.read(null, decoder);
                LOGGER.debug("deserialized data='{}'", result);
            }
            return result;
        } catch (Exception ex) {
            throw new SerializationException(
                    "Can't deserialize data '" + Arrays.toString(data) + "' from topic '" + topic + "'", ex);
        }
    }
}

Exception

org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition mysql-foobar-0 at offset 10. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Can't deserialize data '[0, 0, 0, 0, 21, 2, 20, 2, 6, 102, 111, 111, -80, -78, -44, -31, -81, 90, -80, -78, -44, -31, -81, 90]' from topic 'mysql-foobar'
Caused by: java.lang.InstantiationException: null
    at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48) ~[na:1.8.0_131]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_131]
    at java.lang.Class.newInstance(Class.java:442) ~[na:1.8.0_131]
    at com.spring.kafkaexample.springbootkafkaconsumer.config.AvroDeserializer.deserialize(AvroDeserializer.java:48) ~[classes/:na]
    at com.spring.kafkaexample.springbootkafkaconsumer.config.AvroDeserializer.deserialize(AvroDeserializer.java:18) ~[classes/:na]

1 Answer


This is what the message looks like in the console consumer

It's not clear whether you used kafka-avro-console-consumer or plain kafka-console-consumer. The way to know whether your data is Avro is to look at the producer / connector configuration.
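For example, if the topic really does hold Confluent-Avro data, the registry-aware console consumer should print it as JSON (a sketch assuming a local broker and Schema Registry on the default ports):

kafka-avro-console-consumer \
  --bootstrap-server localhost:9092 \
  --topic mysql-foobar \
  --from-beginning \
  --property schema.registry.url=http://localhost:8081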


There is no need to write your own deserializer, though. Plus, Confluent doesn't use the plain "schema + Avro binary" convention that your code assumes (which is why you're getting that error): each message is prefixed with a magic byte and a 4-byte schema ID, so you need to look up the schema from the Schema Registry first.
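You can actually see that prefix in the bytes from your stack trace: `[0, 0, 0, 0, 21, ...]` is the magic byte `0`, then the big-endian schema ID `21`, then the Avro binary payload. A minimal sketch of reading the prefix (the bytes are copied from your exception message):

import java.nio.ByteBuffer;

public class WireFormatPeek {
    public static void main(String[] args) {
        // First bytes of the failing record, taken from the exception above
        byte[] data = {0, 0, 0, 0, 21, 2, 20, 2, 6, 102, 111, 111};

        ByteBuffer buf = ByteBuffer.wrap(data);
        byte magic = buf.get();      // Confluent wire-format magic byte, always 0
        int schemaId = buf.getInt(); // 4-byte big-endian schema ID -> 21

        System.out.println("magic=" + magic + ", schemaId=" + schemaId);
        // Everything still in the buffer is the Avro-encoded record itself,
        // readable only with the schema registered under that ID.
    }
}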

Add the Confluent Maven repo

<repositories>
  <repository>
    <id>confluent</id>
    <url>https://packages.confluent.io/maven/</url>
  </repository>
</repositories>

Then add the Confluent serializer dependency

<dependency>
  <groupId>io.confluent</groupId>
  <artifactId>kafka-avro-serializer</artifactId>
  <version>${confluent.version}</version>
</dependency>

Then import io.confluent.kafka.serializers.KafkaAvroDeserializer, or use that class in your consumer configs

https://docs.confluent.io/current/clients/install.html#java
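For example, a minimal consumer that reads the topic as GenericRecord (the bootstrap server, group ID, and registry URL below are assumptions for a local Confluent stack):

import java.util.Properties;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class AvroConsumerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "mysql-foobar-reader");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class.getName());
        // The deserializer fetches schemas by ID from the Schema Registry
        props.put("schema.registry.url", "http://localhost:8081");

        // Without specific-record config, values come back as GenericRecord
        try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(java.util.Collections.singletonList("mysql-foobar"));
            consumer.poll(java.time.Duration.ofSeconds(5))
                    .forEach(record -> System.out.println(record.value()));
        }
    }
}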


OR, you could switch your MySQL connector to not use the Avro Converter
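For example, in the connector configuration (a sketch; setting `schemas.enable=false` keeps the JSON payload free of the Connect schema envelope):

value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false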

  • It's Avro. I'm basically trying an extension of the tutorial mentioned in the link https://www.confluent.io/blog/simplest-useful-kafka-connect-data-pipeline-world-thereabouts-part-1/ – user3310115 Mar 16 '19 at 19:53
  • How can I use a normal String serializer instead of Avro? I guess it has something to do with the property `"value.converter": "io.confluent.connect.avro.AvroConverter"` – user3310115 Mar 16 '19 at 19:59
  • You can use `value.converter=org.apache.kafka.connect.json.JsonConverter` – OneCricketeer Mar 16 '19 at 20:01