I have a producer class that sends to a topic using a custom JsonSerializer taken from GitHub:
public class JsonSerializer<T> implements Serializer<T> {
    ...
    @Override
    public byte[] serialize(String topic, T data) {
        try {
            return this.objectMapper.writeValueAsBytes(data);
        } catch (JsonProcessingException e) {
            throw new SerializationException(e);
        }
    }
    ...
}
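For reference, the producer side is wired up roughly like this (a minimal sketch: the bootstrap server, class names, and field values are illustrative, and I'm assuming the custom JsonSerializer above is on the producer's classpath; the real payload carries the mmsi and ts fields that the table uses as its primary key):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PositionSender {

    // Illustrative payload class; the real payload has mmsi and ts fields among others
    public static class Position {
        public long mmsi;
        public long ts;
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // the custom JsonSerializer shown above serializes the value to JSON bytes
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class.getName());

        try (KafkaProducer<String, Position> producer = new KafkaProducer<>(props)) {
            Position p = new Position();
            p.mmsi = 123456789L;
            p.ts = System.currentTimeMillis();
            producer.send(new ProducerRecord<>("test-3", p));
        }
    }
}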
And I am running the DataStax Kafka Connector with this configuration:
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
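The mapping setting that the error below refers to is of this general form (the keyspace and table names here are placeholders rather than my exact setting):

topic.test-3.my_keyspace.my_table.mapping = mmsi=value.mmsi, ts=value.ts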
I get this error while the connector is trying to consume the topic:
[2020-01-12 13:57:53,324] WARN Error inserting/updating row for Kafka record SinkRecord{kafkaOffset=416, timestampType=CreateTime} ConnectRecord{topic='test-3', kafkaPartition=17, key=null, keySchema=Schema{STRING}, value={}, valueSchema=null, timestamp=1578811437723, headers=ConnectHeaders(headers=)}: Primary key column(s) mmsi, ts cannot be left unmapped. Check that your mapping setting matches your dataset contents. (com.datastax.kafkaconnector.DseSinkTask:286)
From that error, I am thinking the connector is unable to retrieve the JSON data. What am I doing wrong?
UPDATE
I tried the Kafka JsonSerializer.
I also tried StringSerializer, since the connector documentation says it is supported as well.
I found that some data actually gets written to the database, but it is always a relatively small number compared to the total data sent to the Kafka topic: about 5 to 10 records.
I kept the connector running and found that after a failed write, it does not write anything anymore.