
I am trying to sink Kafka topic records into an S3 bucket using Kafka Connect + camel-kafka-connector 0.9.

The connector loads fine and I can see it connected to Kafka (its consumer group shows up in AKHQ), but it fails immediately, right after trying to commit offsets, with the following exception in the Kafka Connect pod:

org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:614)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:329)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:189)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:238)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.ClassCastException: class java.math.BigDecimal cannot be cast to class [B (java.math.BigDecimal and [B are in module java.base of loader 'bootstrap')
    at org.apache.camel.kafkaconnector.CamelSinkTask.mapHeader(CamelSinkTask.java:233)
    at org.apache.camel.kafkaconnector.CamelSinkTask.put(CamelSinkTask.java:184)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:586)
    ... 10 more

I don't understand what's going on under the hood, and there seems to be little room for debugging.
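From the stack trace my best guess is that mapHeader receives a record header whose Connect schema is the Decimal logical type, but whose value has already been converted to a java.math.BigDecimal, while the mapping code expects the raw byte[] encoding. The snippet below is only my own minimal reproduction of that cast (it is not the Camel source), but it fails with the exact same ClassCastException:

import java.math.BigDecimal;

import org.apache.kafka.connect.data.Decimal;
import org.apache.kafka.connect.data.Schema;

public class HeaderCastRepro {
    public static void main(String[] args) {
        // A Decimal logical-type schema, as a Connect header converter could attach to a header.
        Schema decimalSchema = Decimal.schema(2);

        // The header value as it apparently arrives in the sink task: already a BigDecimal,
        // not the serialized byte[] form of the decimal.
        Object headerValue = new BigDecimal("123.45");

        // Casting the BigDecimal to byte[] throws:
        // java.lang.ClassCastException: class java.math.BigDecimal cannot be cast to class [B
        BigDecimal mapped = Decimal.toLogical(decimalSchema, (byte[]) headerValue);
        System.out.println(mapped);
    }
}

If that guess is right, the problem is in how the record headers are decoded before they reach the sink task, not in the key/value converter settings below.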

Kafka Connect connector config:

{
  "name": "connector-name",
  "config": {
    "connector.class": "org.apache.camel.kafkaconnector.aws2s3.CamelAws2s3SinkConnector",
    "topics": "topic-v1-0",
    "camel.sink.endpoint.region": "region",
    "camel.sink.path.bucketNameOrArn": "bucket-name",
    "camel.sink.endpoint.keyName": "incoming-v1-0/${date:now:yyyyMMdd-HHmmssSSS}-${exchangeId}",
    "camel.sink.endpoint.useDefaultCredentialsProvider": "true",
    "camel.beans.aggregate": "#class:org.apache.camel.kafkaconnector.aggregator.StringAggregator",
    "camel.aggregation.size": "1000",
    "camel.aggregation.timeout": "5000",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
    "key.converter.schemas.enable": "false",
    "value.converter.schemas.enable": "false"
  }
}
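To at least see which header carries the decimal value, I dump the raw record headers with a small standalone consumer (a sketch only; the bootstrap address and group id are placeholders, and it prints the raw bytes before any Connect header converter gets involved):

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.header.Header;
import org.apache.kafka.common.serialization.StringDeserializer;

public class HeaderDump {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("group.id", "header-dump");             // throwaway group, separate from the connector's
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("topic-v1-0"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
            for (ConsumerRecord<String, String> record : records) {
                for (Header header : record.headers()) {
                    byte[] raw = header.value();
                    System.out.printf("offset=%d header=%s rawValue=%s%n",
                            record.offset(), header.key(), raw == null ? "null" : new String(raw));
                }
            }
        }
    }
}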
Dopele
  • What Java version are you using? – JCompetence Jan 25 '22 at 14:01
  • @SMA How exactly does that address the ClassCastException? BigDecimal and byte[] exist in basically all Java versions, and you still can't cast between them even on a version supported by Camel. – OneCricketeer Jan 26 '22 at 15:07
  • @Dopele Debugging definitely is possible. For example https://stackoverflow.com/questions/45717658/what-is-a-simple-effective-way-to-debug-custom-kafka-connectors But you should find the source code for the CamelSinkTask.mapHeader method first. – OneCricketeer Jan 26 '22 at 15:10
  • @OneCricketeer The fact that the connector only supports Java 8 and 11 says something about the Java compatibility in this case. – JCompetence Jan 26 '22 at 15:36

0 Answers