
I have a producer class that sends to a topic using a custom JsonSerializer taken from GitHub:

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Serializer;

public class JsonSerializer<T> implements Serializer<T> {
    ...
    @Override
    public byte[] serialize(String topic, T data) {
        try {
            // Serialize the POJO to a JSON byte array with Jackson
            return this.objectMapper.writeValueAsBytes(data);
        } catch (JsonProcessingException e) {
            throw new SerializationException(e);
        }
    }
    ...
}
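
The serializer is wired into the producer in the usual way, roughly like this (a sketch; the broker address and the MyEvent value type are placeholders):

Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class.getName());

KafkaProducer<String, MyEvent> producer = new KafkaProducer<>(props);
producer.send(new ProducerRecord<>("test-3", myEvent));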

And I am running the DataStax Kafka Connector with this configuration:

value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
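
The mapping setting follows the connector's topic.<topic>.<keyspace>.<table>.mapping format; mine is roughly along these lines (the keyspace and table names here are placeholders; mmsi and ts are the table's primary key columns):

topic.test-3.my_keyspace.my_table.mapping=mmsi=value.mmsi, ts=value.ts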

I get this error while the connector is trying to consume the topic:

[2020-01-12 13:57:53,324] WARN Error inserting/updating row for Kafka record SinkRecord{kafkaOffset=416, timestampType=CreateTime} ConnectRecord{topic='test-3', kafkaPartition=17, key=null, keySchema=Schema{STRING}, value={}, valueSchema=null, timestamp=1578811437723, headers=ConnectHeaders(headers=)}: Primary key column(s) mmsi, ts cannot be left unmapped. Check that your mapping setting matches your dataset contents. (com.datastax.kafkaconnector.DseSinkTask:286)

From that error, I suspect the connector is unable to parse the JSON data. What am I doing wrong?

UPDATE

I tried the JsonSerializer that ships with Kafka (org.apache.kafka.connect.json.JsonSerializer), as suggested in the comments.
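
That serializer works on Jackson JsonNode values rather than arbitrary POJOs, so the producer side becomes roughly this (a sketch; event is a placeholder object):

props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
        "org.apache.kafka.connect.json.JsonSerializer");
KafkaProducer<String, JsonNode> producer = new KafkaProducer<>(props);
ObjectMapper mapper = new ObjectMapper();
producer.send(new ProducerRecord<>("test-3", mapper.valueToTree(event)));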

I tried StringSerializer as well, since the connector documentation says it is also supported.
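
For that attempt the value is just pre-serialized JSON text (again a sketch; exception handling omitted):

props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
KafkaProducer<String, String> producer = new KafkaProducer<>(props);
producer.send(new ProducerRecord<>("test-3", mapper.writeValueAsString(event)));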

I found that some data actually gets written to the database, but it is always a relatively small number compared to the total sent to the Kafka topic: about 5 to 10 records.

I kept the connector running, and I found that after a failed write it never writes anything again.

panoet
  • what is your mapping in the connector config, and what is the database schema? – Alex Ott Jan 12 '20 at 11:08
  • [Apache Kafka already has a JSON Serializer](https://github.com/apache/kafka/blob/trunk/connect/json/src/main/java/org/apache/kafka/connect/json/JsonSerializer.java) – OneCricketeer Jan 12 '20 at 12:17
  • According to the error, you had no value to write to the database `value={}, valueSchema=null` – OneCricketeer Jan 12 '20 at 12:34
  • @cricket_007 After some research, I tried using the Kafka JsonSerializer, and another error happened. I'll update here once I get to my computer. – panoet Jan 12 '20 at 12:44

1 Answer


It actually was a configuration-related problem. As I mentioned in the update, once the connector hit an error it never wrote data again.

That is because the DataStax connector has an ignoreErrors setting whose default value is false, which means that when the connector encounters an error in a message, it retries that message indefinitely. I set it to true, and the problem was solved.
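
In the connector's properties file, that is a one-line change:

ignoreErrors=true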

panoet