Questions tagged [apache-kafka-connect]

Apache Kafka Connect is a tool for scalably and reliably streaming data between Apache Kafka and other data systems.

It was first released with Kafka 0.9. It lets you import data from external systems (e.g., databases) into Kafka and export data from Kafka into external systems (e.g., Hadoop). Apache Kafka Connect is a framework with a plug-in mechanism that lets you provide custom connectors for your system of choice.
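A minimal sketch of what a connector looks like in practice, using the FileStreamSource connector that ships with Kafka and registering it through the Connect REST API (the file path, topic and connector name here are placeholders):

    POST /connectors
    {
      "name": "local-file-source",
      "config": {
        "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
        "tasks.max": "1",
        "file": "/tmp/test.txt",
        "topic": "connect-test"
      }
    }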

Documentation

3693 questions
1
vote
1 answer

MQTT topics and Kafka topics mapping

I have started to learn about MQTT as I have a telematics use case in my current organisation. I would like to integrate MQTT broker (mosquitto) messages into my Kafka cluster. Since every vehicle is sending its data to its own topic on the MQTT broker…
Nipun
  • 4,119
  • 5
  • 47
  • 83
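A hedged sketch of the kind of configuration involved, assuming the Confluent MQTT source connector is used (property names taken from that connector; the broker URI, topic filter and Kafka topic are placeholders, and the connector's Confluent-license settings are omitted):

    {
      "name": "mqtt-source",
      "config": {
        "connector.class": "io.confluent.connect.mqtt.MqttSourceConnector",
        "mqtt.server.uri": "tcp://mosquitto:1883",
        "mqtt.topics": "vehicles/#",
        "kafka.topic": "telematics-raw",
        "tasks.max": "1"
      }
    }

With a wildcard subscription like this, all vehicle topics land in a single Kafka topic; splitting them back out per vehicle would need a transform or a downstream stream processor.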
1
vote
1 answer

MQTT-Kafka connector authentication

I need help because I'm new to Kafka and MQTT. I will try to briefly explain the architecture I am using. I'm using the Confluent 5.3.1 platform and I configured a connector (MQTT source connector) that transfers data from the MQTT broker to…
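For the broker-side credentials, the MQTT source connector exposes username/password style settings; a heavily hedged sketch (the property names mqtt.username and mqtt.password are assumed from the Confluent connector, values are placeholders):

    "mqtt.server.uri": "ssl://mqtt-broker:8883",
    "mqtt.username": "connect-user",
    "mqtt.password": "changeme"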
1
vote
2 answers

Kafka Connect - Transforms - Blacklist a nested field

Is it possible to remove a nested field using an SMT with Kafka Connect? I know the following works perfectly: "transforms": "ReplaceField", "transforms.ReplaceField.type":…
Yannick
  • 1,240
  • 2
  • 13
  • 25
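For reference, the flat-field form mentioned in the question looks roughly like this; ReplaceField only sees top-level fields, so a nested field typically needs a Flatten transform first or a custom SMT (the field name is a placeholder):

    "transforms": "ReplaceField",
    "transforms.ReplaceField.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
    "transforms.ReplaceField.blacklist": "fieldToDrop"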
1
vote
0 answers

Kafka Connect JDBC Connector | Closing JDBC Connection after each poll

We are going to use the Kafka Connect JDBC Source Connector to ingest data from Oracle databases. We have one Kafka JDBC connector per Oracle DB. Looking at the JDBC connector implementation, if we have N maxTasks per connector (inside…
Ashika Umanga Umagiliya
  • 8,988
  • 28
  • 102
  • 185
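For context, a minimal per-database JDBC source sketch (class and property names from the Confluent JDBC connector; connection details and column names are placeholders):

    {
      "name": "oracle-source",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:oracle:thin:@//db-host:1521/ORCL",
        "connection.user": "user",
        "connection.password": "secret",
        "mode": "timestamp+incrementing",
        "timestamp.column.name": "UPDATED_AT",
        "incrementing.column.name": "ID",
        "topic.prefix": "oracle-",
        "tasks.max": "4",
        "poll.interval.ms": "5000"
      }
    }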
1
vote
0 answers

Define a custom format for the Kafka Connect S3 sink connector

I would like to use the Kafka Connect S3 sink connector to stream data out of a topic to an S3 bucket. The data inside the topic will be XML messages. As per the connector config, we can define the format of the message (for example, JsonFormat). As per the…
VSK
  • 359
  • 2
  • 5
  • 20
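As a reference point, the built-in formats are selected through format.class; a sketch assuming the Confluent S3 sink connector (bucket, region and topic are placeholders), where a custom format would mean supplying your own implementation in place of the JsonFormat class:

    {
      "name": "s3-sink",
      "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "topics": "xml-events",
        "s3.bucket.name": "my-bucket",
        "s3.region": "us-east-1",
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
        "flush.size": "1000"
      }
    }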
1
vote
0 answers

Kafka Connector Implementation: Is it possible to get the task number from the SourceTask?

I'm creating a Kafka connector that loads a list of elements to be requested. This list of elements is being distributed between the different tasks. Let's say we have 100 elements and 4 tasks.max configured; each task will have 25 elements to…
apenlor
  • 87
  • 2
  • 15
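One common pattern, sketched in Java under the assumption of a custom connector (the property key "task.id" is hypothetical), is to stamp each task's configuration with its own index in taskConfigs(), since Connect itself does not hand a task number to the SourceTask:

    // Excerpt from a custom SourceConnector (java.util and org.apache.kafka.connect.* imports assumed).
    @Override
    public List<Map<String, String>> taskConfigs(int maxTasks) {
        List<Map<String, String>> configs = new ArrayList<>();
        for (int i = 0; i < maxTasks; i++) {
            Map<String, String> taskConfig = new HashMap<>(sharedConnectorProps); // copy of the connector config
            taskConfig.put("task.id", Integer.toString(i));                       // hypothetical per-task index
            configs.add(taskConfig);
        }
        return configs;
    }

    // Excerpt from the matching SourceTask: read the index back in start().
    @Override
    public void start(Map<String, String> props) {
        int taskId = Integer.parseInt(props.get("task.id"));
        // use taskId to pick this task's slice of the elements to request
    }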
1
vote
1 answer

Tombstone records from Kafka Connect

Is it possible to configure Kafka Connect (Source) to generate a tombstone record? I have a table recording 'delete' events. I can populate this to a topic and write some code to forward tombstone records to other topics as needed, but if I can…
Debbie R
  • 53
  • 2
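For background, a tombstone is a record with a non-null key and a null value, and a source task can emit one directly; a hedged Java sketch (the topic, key and partition/offset maps are placeholders):

    // Excerpt from a custom SourceTask (uses org.apache.kafka.connect.source.SourceRecord and org.apache.kafka.connect.data.Schema).
    SourceRecord tombstone = new SourceRecord(
            sourcePartition,       // Map<String, ?> describing the source partition
            sourceOffset,          // Map<String, ?> describing the source offset
            "my-topic",            // target Kafka topic (placeholder)
            Schema.STRING_SCHEMA,  // key schema
            "deleted-row-key",     // key of the deleted row (placeholder)
            null,                  // no value schema
            null);                 // null value = tombstone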
1
vote
1 answer

Failed to connect to and describe Kafka cluster. Apache Kafka Connect

I have set up an MSK cluster in AWS and created an EC2 instance in the same VPC. I tried kafka-console-consumer.sh and kafka-console-producer.sh and they work fine. I was able to see the messages sent by the producer in the consumer. 1) I have downloaded the…
VSK
  • 359
  • 2
  • 5
  • 20
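"Failed to connect to and describe Kafka cluster" typically points at the worker's bootstrap or security settings rather than the connector itself; a minimal distributed-worker sketch, assuming plaintext listeners (broker addresses and paths are placeholders):

    bootstrap.servers=b-1.mymsk.xxxxxx.kafka.us-east-1.amazonaws.com:9092
    group.id=connect-cluster
    key.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter=org.apache.kafka.connect.json.JsonConverter
    offset.storage.topic=connect-offsets
    config.storage.topic=connect-configs
    status.storage.topic=connect-status
    plugin.path=/usr/local/share/kafka/plugins
    # If the MSK listeners are TLS-only (port 9094), security.protocol=SSL is also needed.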
1
vote
1 answer

How to set up mongo-kafka-connect?

I am not using Confluent; I am able to run ZooKeeper and Kafka successfully. I am following the steps to set up mongo-kafka connect using the jar file and am getting an error. Once I download the mongo-kafka-connect-0.2-all.jar file from Maven, do I…
padma Raj
  • 11
  • 2
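Running outside Confluent typically means putting the jar on the worker's plugin.path and then registering the connector; a hedged sketch (paths, URI, database and collection names are placeholders; connector property names are from the MongoDB Kafka connector):

    # worker properties (excerpt)
    plugin.path=/opt/connect-plugins    # directory containing mongo-kafka-connect-0.2-all.jar

    # connector registration
    {
      "name": "mongo-source",
      "config": {
        "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
        "connection.uri": "mongodb://localhost:27017",
        "database": "mydb",
        "collection": "mycol"
      }
    }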
1
vote
0 answers

Read Kafka Connect worker configs (e.g. the bootstrap servers property) dynamically inside a custom connector

I am trying to write a custom source connector for Kafka Connect, and it requires me to write a simple plain Kafka producer inside it (one that has no relevance to any of the connector properties). The requirement is to send messages about bad…
Jenison Gracious
  • 486
  • 4
  • 13
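Worker-level settings such as bootstrap.servers are not passed through to connectors, so a common workaround, sketched in Java (the connector property name bad.records.bootstrap.servers and the errorProducer field are hypothetical), is to carry the broker list as an explicit connector property and build the producer in start():

    // Excerpt from a custom SourceTask (org.apache.kafka.clients.producer.* and java.util imports assumed).
    @Override
    public void start(Map<String, String> props) {
        Properties producerProps = new Properties();
        // hypothetical connector property carrying the same brokers the worker uses
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, props.get("bad.records.bootstrap.servers"));
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        this.errorProducer = new KafkaProducer<>(producerProps); // plain producer for the "bad message" topic
    }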
1
vote
0 answers

Configure Kafka Connect sink for Elasticsearch 7.1 using Docker Compose

I am setting up a producer that sends messages as key/value pairs [the key is a generated unique string, the value is a JSON payload] to Kafka topics (v1.0.0), which are pulled by Kafka Connect (v5.3.1) and then sent to an Elasticsearch container (v…
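For reference, a minimal Elasticsearch sink sketch assuming the Confluent Elasticsearch connector (hostnames and topic are placeholders; type.name is commonly set to _doc for Elasticsearch 7):

    {
      "name": "es-sink",
      "config": {
        "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "topics": "my-topic",
        "connection.url": "http://elasticsearch:9200",
        "type.name": "_doc",
        "key.ignore": "false",
        "schema.ignore": "true"
      }
    }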
1
vote
0 answers

Can the Kafka FileStream sink connector write files hourly?

I am trying to write a file sink connector that will consume topic messages and write them to hourly files. I cannot find any configuration for this on the FileStream sink connector.
Ihab Ame
  • 11
  • 1
1
vote
1 answer

Debezium MongoDB Kafka connector not producing some records in the topic as they are in MongoDB

In my MongoDB I have this data: mongo01:PRIMARY> db.col.find({"_id" : ObjectId("5d8777f188fef5555b")}) { "_id" : ObjectId("5d8777f188fef5555b"), "attachments" : [ { "name" : "Je", "src" : "https://google.co", "type" : "image/png" } ], "tags" :…
1
vote
1 answer

Using mongo-kafka as sink connector, how do I map a topic record's value field to another value?

I'm new to both Kafka Connect and MongoDB. I have a record in a Kafka topic with a value of { "Id": "foo" } and I would like the Id to map to BAR when stored as a document in a collection in mongo. Expected result to be { "BAR": "foo" }. What should…
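One way to express that mapping, sketched with the built-in ReplaceField SMT on the sink connector (the transform alias is arbitrary), is the renames option:

    "transforms": "RenameId",
    "transforms.RenameId.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
    "transforms.RenameId.renames": "Id:BAR"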
1
vote
1 answer

How to handle dynamic message value in Kafka?

I am having a hard time implementing a feature where I can have a dynamic message value for Kafka. I am using AvroProducer from confluent-kafka-python along with the schema registry. The producer will send messages in a format like this: {'id':1,…
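When the value shape varies, one schema-registry-friendly option is to declare the variable part of the record as an Avro union; a hedged sketch of such a schema (record and field names are placeholders):

    {
      "type": "record",
      "name": "Event",
      "fields": [
        {"name": "id", "type": "int"},
        {"name": "payload",
         "type": ["null", "string", "long", {"type": "map", "values": "string"}],
         "default": null}
      ]
    }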