Questions tagged [apache-kafka-connect]

Apache Kafka Connect is a tool for scalably and reliably streaming data between Apache Kafka and other data systems.

It was first released with Kafka 0.9. It allows importing data from external systems (e.g., databases) into Kafka, and exporting data from Kafka into external systems (e.g., Hadoop). Apache Kafka Connect is a framework with a plug-in mechanism that lets you provide custom connectors for your system of choice.

Documentation

3693 questions
5 votes · 1 answer

Force Confluent s3 sink to flush

I set up the Kafka Connect S3 sink with the rotation duration set to 1 hour, and also set a rather large flush count, say 10,000. Now if there are not many messages in the Kafka topic, the S3 sink will buffer them in memory and wait for them to accumulate to the flush…
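For reference on the scenario above: the Confluent S3 sink commits a file when either `flush.size` records accumulate or a rotation interval elapses, and `rotate.schedule.interval.ms` is the wall-clock variant that forces a commit even when few new records arrive. A minimal config sketch (the connector name, topic, and bucket are placeholders):

```json
{
  "name": "my-s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "my-topic",
    "s3.bucket.name": "my-bucket",
    "flush.size": "10000",
    "rotate.schedule.interval.ms": "3600000"
  }
}
```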
5 votes · 4 answers

How to change the "kafka connect" component port?

On port 8083 I am running InfluxDB, for which I even get the GUI at http://localhost:8083. Now, coming to Kafka: here I am following the setup as per https://kafka.apache.org/quickstart, starting the ZooKeeper which is in folder…
surya rahul · 833 · 2 · 14 · 27
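As a note on the port question above: the Kafka Connect worker's REST interface defaults to 8083 and can be moved in the worker properties file. A sketch of both forms, assuming a worker properties file (the older `rest.port` setting, and the `listeners` form available in newer Kafka versions):

```properties
# Older style: change the REST port away from the InfluxDB conflict
rest.port=8084

# Newer style (Kafka 1.1+): bind the REST listener explicitly
listeners=HTTP://:8084
```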
5 votes · 3 answers

Which jmx metric should be used to monitor the status of a connector in kafka connect?

I'm using the following JMX metrics for Kafka Connect.
Jigar Mehta · 73 · 3 · 10
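For the monitoring question above: Kafka Connect (since 1.0, via KIP-196) exposes connector and task state through JMX. To the best of my knowledge the relevant MBeans are the ones below, each with a `status` attribute (connector and task names are placeholders):

```
kafka.connect:type=connector-metrics,connector=<name>
kafka.connect:type=connector-task-metrics,connector=<name>,task=<id>
```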
5 votes · 1 answer

How to properly restart a kafka s3 sink connect?

I started a Kafka S3 sink connector (the bundled connector from the Confluent package) on 1 May. It worked fine until 8 May. Checking the status, it reports that an AWS exception crashed this connector. This should not be a big problem, so I want to…
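On restarting: the Connect worker's REST API can restart a connector without touching the worker process; note that failed tasks are restarted separately from the connector instance. A sketch against a worker on the default port, with `my-s3-sink` as a placeholder name:

```shell
# Inspect why the connector/tasks failed
curl -s http://localhost:8083/connectors/my-s3-sink/status

# Restart the connector instance
curl -X POST http://localhost:8083/connectors/my-s3-sink/restart

# Failed tasks must be restarted individually
curl -X POST http://localhost:8083/connectors/my-s3-sink/tasks/0/restart
```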
5 votes · 7 answers

Kafka : Running Confluent in a Windows environment

I set up Kafka to run locally. I have written a sample producer and consumer in Java and run them locally, starting the server and ZooKeeper. I want to use Oracle as a producer, which will require writing the configuration file (already written),…
5 votes · 1 answer

Connector config contains no connector type

I'm trying to use the JDBC connector to connect to a PostgreSQL database on my cluster (the database is not directly managed by the cluster). I've been starting Kafka Connect with the following command: connect-standalone.sh worker.properties…
frollo · 1,296 · 1 · 13 · 29
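Regarding the "no connector type" error above: `connect-standalone.sh` expects the worker properties followed by at least one connector properties file, and that connector file must name a `connector.class`. A sketch of such a file for the JDBC source (the file name, connection URL, and column names are placeholders):

```properties
# jdbc-source.properties (hypothetical), passed as the second argument:
#   connect-standalone.sh worker.properties jdbc-source.properties
name=postgres-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:postgresql://db-host:5432/mydb
mode=incrementing
incrementing.column.name=id
topic.prefix=pg-
```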
5 votes · 0 answers

Can I access Kafka Connect Worker config from connector or task?

I am developing a custom Kafka source connector. I would like to access the worker configuration, such as the key converter, value converter, schema registry URL, or ZooKeeper URL, but I haven't found a way to do that. Any idea? Is that…
Xiang Zhang · 2,831 · 20 · 40
5 votes · 1 answer

On what nodes should Kafka Connect distributed be deployed on Azure Kafka for HD Insight?

We are running a lot of connectors on premises and need to move to Azure. These on-premises machines run the Kafka Connect API on 4 nodes. We deploy this API by executing this on all of the machines: export…
5 votes · 3 answers

How can a org.apache.kafka.connect.data.Decimal stored in an avro file be converted to a python type?

I am trying to interpret an Avro record stored by Debezium in Kafka, using Python: { "name": "id", "type": { "type": "bytes", "scale": 0, "precision": 64, …
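For the Decimal question above: Connect's logical `Decimal` type stores the unscaled value as big-endian two's-complement bytes (matching `java.math.BigDecimal.unscaledValue().toByteArray()`), with the scale carried in the schema. A minimal Python sketch of the decoding (the function name is mine):

```python
from decimal import Decimal

def decode_connect_decimal(raw: bytes, scale: int) -> Decimal:
    # The bytes are the big-endian two's-complement unscaled integer.
    unscaled = int.from_bytes(raw, byteorder="big", signed=True)
    # Shift the decimal point left by `scale` digits.
    return Decimal(unscaled).scaleb(-scale)

# b"\x04\xd2" is 1234; with scale 2 the logical value is 12.34.
print(decode_connect_decimal(b"\x04\xd2", 2))  # → 12.34
```

With `scale: 0`, as in the schema fragment above, the decoded value is simply the signed integer.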
5 votes · 1 answer

Kafka-ES-Sink : ConnectException: Key is used as document id and can not be null

I am trying to add a key using SMT functions, to use it as the document id for the ES document, but it is not working. I am using the Confluent ES connector. Config file…
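On the SMT question above: a common pattern is `ValueToKey` to copy a value field into the record key, followed by `ExtractField$Key` to flatten it to a primitive the ES connector can use as the document id. A sketch, assuming the value has an `id` field (the field and transform aliases are placeholders):

```properties
transforms=createKey,extractId
transforms.createKey.type=org.apache.kafka.connect.transforms.ValueToKey
transforms.createKey.fields=id
transforms.extractId.type=org.apache.kafka.connect.transforms.ExtractField$Key
transforms.extractId.field=id
# The ES sink must actually use the key as the document id
key.ignore=false
```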
5 votes · 1 answer

Kafka Connect - Cannot ALTER to add missing field SinkRecordField{schema=Schema{BYTES}, name='CreateUID', isPrimaryKey=true},

I am using the JDBC source connector to read data from a Teradata table and push it to a Kafka topic. But when I try to use the JDBC sink connector to read the Kafka topic and push to an Oracle table, it throws the below error. I am sure the error is because of…
Digvijay Waghela · 227 · 1 · 5 · 15
5 votes · 2 answers

kafka connector HTTP/API source

I am aware of how to capture data from any data source, such as a specific API (e.g. an HTTP GET request), and ingest it into a specific Kafka connector. { "name": "localfileSource", "config": { "connector.class":…
Alex · 51 · 1 · 4
5 votes · 3 answers

Kafka Connect transforming JSON string to actual JSON

I'm trying to figure out whether it's possible to transform JSON values that are stored as strings into actual JSON structures using Kafka Connect. I tried looking for such a transformation but couldn't find one. As an example, this could be the…
Evaldas Buinauskas · 13,739 · 11 · 55 · 107
5 votes · 1 answer

What is the reasoning behind Kafka Connect Schemas?

We are writing a custom sink connector to write the content of a topic containing Avro messages to CEPH storage. For this we are provided with SinkRecords, which have a Kafka Connect schema that is a mapped version of our Avro schema. Since we want to…
5 votes · 1 answer

Kafka-cassandra connector fails after confluent 3.3 upgrade

The Cassandra connector fails after a Confluent upgrade to 3.3.0. The Cassandra driver version is 3.3. The stack trace is given below. [2017-09-14 08:56:28,123] ERROR java.lang.reflect.InvocationTargetException…