Questions tagged [apache-kafka-connect]

Apache Kafka Connect is a tool for scalably and reliably streaming data between Apache Kafka and other data systems.

It was first released with Kafka 0.9. It lets you import data from external systems (e.g., databases) into Kafka and export data from Kafka into external systems (e.g., Hadoop). Apache Kafka Connect is a framework with a plug-in mechanism that allows custom connectors to be provided for your system of choice.
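In practice a connector is just a small set of configuration properties submitted to a Connect worker. A minimal sketch, assuming a distributed worker listening on localhost:8083 and the FileStreamSource connector that ships with Kafka (file path and topic name are placeholders):

  curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
    "name": "file-source-example",
    "config": {
      "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
      "tasks.max": "1",
      "file": "/tmp/input.txt",
      "topic": "example-topic"
    }
  }'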

Documentation

3693 questions
5
votes
1 answer

Kafka Connect with a JdbcConnectionSource connector fails to create task (connector is RUNNING but task is not)

It seems like rather often I create a Kafka Connect connector from the JdbcConnectionSource based on a query, and the connector is created successfully with status "RUNNING", but no task is created. Looking in the console logs of my container, I…
Patrick Szalapski
  • 8,738
  • 11
  • 67
  • 129
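
For context, a query-based JDBC source connector is typically configured along these lines (a minimal sketch assuming Confluent's kafka-connect-jdbc; connection details and column names are placeholders), and the task-level failure reason can be inspected via GET /connectors/<name>/status:

  {
    "name": "jdbc-query-source",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
      "connection.url": "jdbc:postgresql://db-host:5432/mydb",
      "connection.user": "user",
      "connection.password": "secret",
      "mode": "timestamp",
      "query": "SELECT id, name, updated_at FROM my_table",
      "timestamp.column.name": "updated_at",
      "topic.prefix": "my-query-topic",
      "poll.interval.ms": "10000"
    }
  }
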
5
votes
1 answer

How can I send data without schema to kafka - confluent jdbc - sink usage?

I load my data from Kafka to Oracle with the Confluent JDBC sink, but I write my schema on the value together with the data. I do not want to write the schema with the data; how can I put the schema on the Kafka topic and then send just the data from my client? thanks in…
CompEng
  • 7,161
  • 16
  • 68
  • 122
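
The JDBC sink needs schema information to build its SQL statements, so the usual way to avoid embedding a schema in every message is to use Schema Registry based converters; clients then send only the data while the schema lives in the registry. A converter sketch, assuming Confluent's AvroConverter and a Schema Registry at localhost:8081 (added to the sink connector's config):

  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "value.converter.schema.registry.url": "http://localhost:8081",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter"
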
5
votes
3 answers

Kafka Log Compacted Topic Duplication Values against same key not deleted

Log compacted topics are not supposed to keep duplicates against the same key. But in our case, when a new value with the same key is sent, the previous one isn't deleted. What could be the issue? val TestCompactState: KTable[String, TestCompact] =…
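
Worth noting for this kind of question: compaction only removes older values for a key after the log cleaner has processed closed segments, and the active segment is never compacted, so recent duplicates can stay visible for a while. A topic-configuration sketch that makes compaction kick in sooner (values are illustrative, not recommendations; --bootstrap-server assumes a reasonably recent Kafka CLI):

  kafka-configs.sh --bootstrap-server localhost:9092 --alter \
    --entity-type topics --entity-name my-compacted-topic \
    --add-config cleanup.policy=compact,min.cleanable.dirty.ratio=0.01,segment.ms=60000,delete.retention.ms=100
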
5
votes
4 answers

Debezium: No maximum LSN recorded in the database; please ensure that the SQL Server Agent is running

This question is related to: "Debezium How do I correctly register the SqlServer connector with Kafka Connect - connection refused". In Windows 10, I have Debezium running on an instance of Microsoft SQL Server that is outside of a Docker container. I…
J Weezy
  • 3,507
  • 3
  • 32
  • 88
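
For reference, SQL Server CDC relies on capture jobs that run under SQL Server Agent, so the Agent service must be running before Debezium can observe a maximum LSN. A minimal connector sketch using Debezium 1.x-era property names (all connection values are placeholders):

  {
    "name": "sqlserver-source",
    "config": {
      "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
      "database.hostname": "localhost",
      "database.port": "1433",
      "database.user": "sa",
      "database.password": "secret",
      "database.dbname": "mydb",
      "database.server.name": "sqlserver1",
      "table.include.list": "dbo.my_table",
      "database.history.kafka.bootstrap.servers": "localhost:9092",
      "database.history.kafka.topic": "schema-changes.mydb"
    }
  }
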
5
votes
1 answer

Can I map multiple buckets with multiple topics in a single Kafka-Connector S3 sink connector?

Google didn't help me, so I want to ask you. I have a lot of kafka topics, and I want to store the messages of a particular topic in a particular S3 bucket. Do I need to create an S3 sink connector for each bucket or can I configure all the stuff…
ecurbelo
  • 333
  • 3
  • 11
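
For context, the Confluent S3 sink takes a single s3.bucket.name per connector, so mapping different topics to different buckets generally means running one connector per bucket. A single-bucket sketch covering several topics (bucket, region, and topic names are placeholders):

  {
    "name": "s3-sink-bucket-a",
    "config": {
      "connector.class": "io.confluent.connect.s3.S3SinkConnector",
      "topics": "orders,payments,shipments",
      "s3.bucket.name": "bucket-a",
      "s3.region": "us-east-1",
      "storage.class": "io.confluent.connect.s3.storage.S3Storage",
      "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
      "flush.size": "1000"
    }
  }
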
5
votes
1 answer

Connector fails when schema registry's master changes

My source connector throws Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Error while forwarding register schema request to the master; error code: 50003 or Caused by:…
Holm
  • 2,987
  • 3
  • 27
  • 48
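
That error is raised by a Schema Registry follower forwarding a schema registration to the master while leadership is changing, not by the connector config itself. For reference, this is the converter setup on the Connect side that triggers such registrations (a sketch; the URL list is a placeholder, and listing several registry instances is commonly done so the client can fail over):

  "value.converter": "io.confluent.connect.avro.AvroConverter",
  "value.converter.schema.registry.url": "http://sr-1:8081,http://sr-2:8081"
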
5
votes
1 answer

Kafka Connect - Failed to commit offsets and flush

I had my Kafka Connectors paused and upon restarting them got these errors in my logs [2020-02-19 19:36:00,219] ERROR WorkerSourceTask{id=wem-postgres-source-0} Failed to commit offsets…
AnonymousAlias
  • 1,149
  • 2
  • 27
  • 68
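
That message comes from the worker's periodic offset flush, which is governed by a couple of worker-level settings; raising the flush timeout is a common first step when source tasks have a large backlog of unacknowledged records. A worker-config sketch (values are illustrative):

  # connect-distributed.properties (worker config)
  offset.flush.interval.ms=60000
  offset.flush.timeout.ms=30000
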
5
votes
1 answer

Does errors.deadletterqueue.topic.name work for source connector

Does "errors.deadletterqueue.topic.name" work for source connector? I tested with JDBC sink connector and it works, but I don't find a record which has serialization error goes to dead letter queue. I use Debezium Connector for MongoDB and…
Holm
  • 2,987
  • 3
  • 27
  • 48
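
For reference, the dead letter queue settings introduced with Connect's error handling (KIP-298) are documented as sink connector options, while the generic errors.* tolerance and logging settings apply more broadly. A sink-side sketch (topic name is a placeholder):

  "errors.tolerance": "all",
  "errors.log.enable": "true",
  "errors.log.include.messages": "true",
  "errors.deadletterqueue.topic.name": "dlq-my-sink",
  "errors.deadletterqueue.topic.replication.factor": "1",
  "errors.deadletterqueue.context.headers.enable": "true"
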
5
votes
2 answers

Kafka MirrorMaker 2.0 duplicate each messages

I am trying to replicate a Kafka cluster with MirrorMaker 2.0. I am using the following mm2.properties: name = mirror-site1-site2 topics = .* connector.class = org.apache.kafka.connect.mirror.MirrorSourceConnector tasks.max =…
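
For comparison, the usual mm2.properties layout drives replication per cluster pair rather than configuring MirrorSourceConnector directly. A minimal sketch assuming two clusters aliased site1 and site2 (bootstrap servers are placeholders):

  clusters = site1, site2
  site1.bootstrap.servers = site1-broker:9092
  site2.bootstrap.servers = site2-broker:9092

  site1->site2.enabled = true
  site1->site2.topics = .*
  site2->site1.enabled = false
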
5
votes
3 answers

Kafka Connect JDBC Sink Connector - java.sql.SQLException: No suitable driver found

I'm trying to sink table data from one DB to another DB using Kafka Debezium (Kafka streaming) with the help of Docker. The DB stream is working fine, but sinking the streamed data into the other MySQL DB fails with an error. For my connector sink…
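
"No suitable driver found" usually means the JDBC driver jar for the target database is not on the connector's plugin path rather than a problem with the config itself; for MySQL that typically means placing mysql-connector-java-*.jar next to the kafka-connect-jdbc plugin and restarting the worker. A sink config sketch for reference (connection values are placeholders):

  {
    "name": "mysql-sink",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
      "topics": "my-topic",
      "connection.url": "jdbc:mysql://mysql-host:3306/targetdb",
      "connection.user": "user",
      "connection.password": "secret",
      "auto.create": "true",
      "insert.mode": "upsert",
      "pk.mode": "record_key"
    }
  }
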
5
votes
3 answers

Facing issue in Connecting Kafka 3.0 - org.apache.kafka.common.KafkaException: Failed to load SSL keystore

I am trying to connect to Kafka 3.0 with SSL but am facing an issue with loading the SSL keystore. I have tried many possible values, but no help. I have tried changing the locations and changing the value of the location, but that still didn't help. package…
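
For comparison, a minimal SSL client configuration looks like the following (paths and passwords are placeholders); on Windows the keystore path itself is often what needs attention, e.g. forward slashes or a properly escaped drive letter:

  security.protocol=SSL
  ssl.keystore.location=C:/kafka/ssl/client.keystore.jks
  ssl.keystore.password=changeit
  ssl.key.password=changeit
  ssl.truststore.location=C:/kafka/ssl/client.truststore.jks
  ssl.truststore.password=changeit
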
5
votes
2 answers

Implementing a kafka connect custom partitioner

I'm using Confluent's Kafka Connect to pipe data into an S3 bucket, ideally partitioned based on a key. Since the existing FieldPartitioner only works for Avro schema records and not for general stringified JSON text, I thought I'd write my own…
Pita
  • 1,444
  • 1
  • 19
  • 29
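
For reference, the stock partitioners are selected through the S3 sink's partitioner.class setting, and a custom implementation is plugged in the same way once its jar is on the worker's plugin path. A sketch using the built-in FieldPartitioner (field name is a placeholder):

  "partitioner.class": "io.confluent.connect.storage.partitioner.FieldPartitioner",
  "partition.field.name": "customer_id"

A custom class would simply replace the partitioner.class value, e.g. a hypothetical com.example.JsonKeyPartitioner, once it is packaged alongside the S3 sink plugin.
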
5
votes
2 answers

Kafka Connector for Oracle Database Source

I want to build a Kafka connector in order to retrieve records from a database in near real time. My database is Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 and the tables have millions of records. First of all, I would like to add…
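
One common approach without a CDC tool is the Confluent JDBC source connector in timestamp+incrementing mode, provided the Oracle thin driver is available to the connector. A sketch using the long-standing kafka-connect-jdbc property names (connection details, table, and column names are placeholders):

  {
    "name": "oracle-jdbc-source",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
      "connection.url": "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1",
      "connection.user": "user",
      "connection.password": "secret",
      "mode": "timestamp+incrementing",
      "timestamp.column.name": "LAST_MODIFIED",
      "incrementing.column.name": "ID",
      "table.whitelist": "MY_TABLE",
      "topic.prefix": "oracle-",
      "poll.interval.ms": "5000"
    }
  }
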
5
votes
4 answers

Kafka producer can't create topics and throwing continuous error after creating Debezium MySQL connector

I am using Debezium as a CDC tool to stream data from MySQL. After installing the Debezium MySQL connector on a Confluent OSS cluster, I am trying to capture MySQL binlog changes into a Kafka topic. When I create a connector, after taking the snapshot of…
Rahul Gupta
  • 51
  • 1
  • 3
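
For reference, a Debezium MySQL source registration usually looks like this (pre-1.x/1.x-style property names matching Confluent OSS-era setups; all connection values are placeholders), and the brokers must allow the connector's history and change topics to be created:

  {
    "name": "mysql-source",
    "config": {
      "connector.class": "io.debezium.connector.mysql.MySqlConnector",
      "database.hostname": "mysql-host",
      "database.port": "3306",
      "database.user": "debezium",
      "database.password": "secret",
      "database.server.id": "184054",
      "database.server.name": "mysql1",
      "database.history.kafka.bootstrap.servers": "localhost:9092",
      "database.history.kafka.topic": "schema-changes.mysql1"
    }
  }
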
5
votes
1 answer

How to run Kafka Connect connectors automatically (e.g. in production)?

Is there a way to automatically load (multiple) Kafka Connect connectors upon the start of Kafka Connect (e.g. in Confluent Platform)? What I've found out so far: the Confluent docs say to use the bin/connect-standalone command for Standalone Mode…
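
In standalone mode, connector property files can indeed be passed on the command line and are loaded when the worker starts; in distributed mode connectors are created through the REST API, so automation usually means posting the JSON configs once the worker is up. A sketch of both (paths, host, and file names are placeholders):

  # standalone: worker config followed by one or more connector configs
  bin/connect-standalone.sh config/worker.properties config/source-1.properties config/sink-1.properties

  # distributed: create or update a connector via the REST API once the worker is reachable
  # (connector-config.json holds just the connector's config map)
  curl -s -X PUT -H "Content-Type: application/json" \
    --data @connector-config.json \
    http://localhost:8083/connectors/my-connector/config
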