We are trying to send JSON events from a Kafka topic to an HTTP endpoint using camel-http-sink-kafka-connector, but an InvalidPayloadException is thrown whenever an event is produced to the topic.

The following steps were followed for the setup:

Built an uber jar of camel-http-sink-kafka-connector and added it to plugin.path. Here are the configuration properties of the sink connector:

name=camel-sink
connector.class=org.apache.camel.kafkaconnector.httpsink.CamelHttpsinkSinkConnector
tasks.max=1
errors.deadletterqueue.topic.name=errors-topic
errors.deadletterqueue.topic.replication.factor=1
errors.deadletterqueue.context.headers.enable=true
errors.tolerance=all
value.converter.schemas.enable=false
output.data.format=JSON
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
topics=new-topic
camel.kamelet.http-sink.method=POST
bootstrap.servers=localhost:9092
camel.kamelet.http-sink.url=http://localhost:8090/post
camel.idempotency.kafka.topic=new-topic
camel.idempotency.kafka.bootstrap.servers=localhost:9092

Started the sink connector in standalone mode with the following command:

-> bin/connect-standalone.sh config/connect-standalone.properties config/connect-http-sink.properties
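
For reference, the relevant part of config/connect-standalone.properties looks roughly like this; the plugin.path value is a placeholder for the directory that actually holds the connector uber jar:

bootstrap.servers=localhost:9092
offset.storage.file.filename=/tmp/connect.offsets
plugin.path=/opt/kafka/plugins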

JSON events are produced to the configured Kafka topic from a sample Spring Boot application, with the producer factory defined as:

@Bean
public ProducerFactory<String, User> producerFactory() {
    Map<String, Object> configProps = new HashMap<>();
    configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return new DefaultKafkaProducerFactory<>(configProps);
}
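
The producer factory is used in the usual way through a KafkaTemplate; a rough sketch of the publishing side, in the same configuration class (the User fields and the send call are illustrative, not the exact application code):

@Bean
public KafkaTemplate<String, User> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}

// Illustrative producer call elsewhere in the application:
kafkaTemplate.send("new-topic", new User("jdoe", "John", "Doe"));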

Whenever a JSON event is added to the topic, we get the following exception:

(org.apache.camel.processor.errorhandler.DefaultErrorHandler:205)
org.apache.camel.InvalidPayloadException: No body available of type: java.io.InputStream but has type: java.util.HashMap on: Message. Caused by: No type converter available to convert from type: java.util.HashMap to the required type: java.io.InputStream. Exchange[A6C22F5BE5CA5B0-0000000000000000]. Caused by: [org.apache.camel.NoTypeConversionAvailableException - No type converter available to convert from type: java.util.HashMap to the required type: java.io.InputStream]
      at org.apache.camel.support.MessageSupport.getMandatoryBody(MessageSupport.java:125)
      at org.apache.camel.component.http.HttpProducer.createRequestEntity(HttpProducer.java:745)
      at org.apache.camel.component.http.HttpProducer.createMethod(HttpProducer.java:632)
      at org.apache.camel.component.http.HttpProducer.process(HttpProducer.java:162)
      at org.apache.camel.support.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:66)
      at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:172)
      at org.apache.camel.processor.errorhandler.RedeliveryErrorHandler$SimpleTask.run(RedeliveryErrorHandler.java:463)
      at org.apache.camel.impl.engine.DefaultReactiveExecutor$Worker.schedule(DefaultReactiveExecutor.java:181)
      at org.apache.camel.impl.engine.DefaultReactiveExecutor.scheduleMain(DefaultReactiveExecutor.java:59)
      at org.apache.camel.processor.Pipeline.process(Pipeline.java:184)
      at org.apache.camel.impl.engine.CamelInternalProcessor.process(CamelInternalProcessor.java:392)
      at org.apache.camel.component.direct.DirectProducer.process(DirectProducer.java:96)
      at org.apache.camel.impl.engine.SharedCamelInternalProcessor.process(SharedCamelInternalProcessor.java:214)
      at org.apache.camel.impl.engine.SharedCamelInternalProcessor$1.process(SharedCamelInternalProcessor.java:111)
      at org.apache.camel.impl.engine.DefaultAsyncProcessorAwaitManager.process(DefaultAsyncProcessorAwaitManager.java:83)
      at org.apache.camel.impl.engine.SharedCamelInternalProcessor.process(SharedCamelInternalProcessor.java:108)
      at org.apache.camel.support.cache.DefaultProducerCache.send(DefaultProducerCache.java:199)
      at org.apache.camel.impl.engine.DefaultProducerTemplate.send(DefaultProducerTemplate.java:176)
      at org.apache.camel.impl.engine.DefaultProducerTemplate.send(DefaultProducerTemplate.java:148)
      at org.apache.camel.kafkaconnector.CamelSinkTask.put(CamelSinkTask.java:205)
      at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:583)
      at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:336)
      at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:237)
      at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:206)
      at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:202)
      at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:257)
      at org.apache.kafka.connect.runtime.isolation.Plugins.lambda$withClassLoader$1(Plugins.java:177)
      at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
      at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
      at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
      at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
      at java.base/java.lang.Thread.run(Thread.java:834)

Also, if we use value.converter=org.apache.kafka.connect.storage.StringConverter, we are able to reach the HTTP endpoint, but the media type of the request is application/octet-stream instead of application/json.
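
The media type is easy to observe with a stub controller behind http://localhost:8090/post; a minimal sketch, assuming a plain Spring Boot receiver (class and method names are illustrative):

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
public class PostStubController {

    @PostMapping("/post")
    public ResponseEntity<String> receive(@RequestBody String body,
                                          @RequestHeader(value = "Content-Type", required = false) String contentType) {
        // With StringConverter this logs application/octet-stream;
        // with JsonConverter the exception is thrown before the request is ever sent.
        System.out.println("Content-Type: " + contentType + ", body: " + body);
        return ResponseEntity.ok("received");
    }
}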

If we use org.apache.kafka.connect.json.JsonConverter with connect-file-3.4.0.jar as the sink (config sketched below), we are able to receive the JSON events in the destination file. What is the issue above? Any help would be really appreciated as we are on a strict timeline.
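
For reference, that file-sink comparison can be configured roughly as follows; the connector name and output file path are placeholders rather than the exact values used:

name=file-sink-test
connector.class=org.apache.kafka.connect.file.FileStreamSinkConnector
tasks.max=1
topics=new-topic
file=/tmp/new-topic-out.txt
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false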

My objective is to use JsonConverter to push JSON events from the Kafka topic to the HTTP endpoint.
