Questions tagged [apache-kafka-connect]

Apache Kafka Connect is a tool for scalable and reliable streaming of data between Apache Kafka and other data systems.

It was first released with Kafka 0.9. It allows importing data from external systems (e.g., databases) into Kafka and exporting data from Kafka into external systems (e.g., Hadoop). Apache Kafka Connect is a framework that supports a plug-in mechanism, allowing custom connectors to be provided for your system of choice.
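
As a concrete illustration, connectors are usually registered with a running Connect worker over its REST API. A minimal sketch using the FileStream example connector that ships with Kafka; the worker address, file path, and topic name are placeholders:

    curl -X POST -H "Content-Type: application/json" \
      --data '{
        "name": "example-file-source",
        "config": {
          "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
          "tasks.max": "1",
          "file": "/tmp/input.txt",
          "topic": "example-topic"
        }
      }' \
      http://localhost:8083/connectors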

Documentation

3693 questions
1
vote
0 answers

Getting an error while ingesting nested Avro into an MS SQL table using kafka-connect-jdbc in Kafka

As part of a POC, I am trying to ingest Avro messages, with Schema Registry enabled, from Kafka topics into a JDBC sink (MS SQL database). But I am facing some issues while ingesting nested Avro data into an MS SQL table. I am using kafka-connect-jdbc-sink to…
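
The JDBC sink generally expects flat records, so nested Avro structs are commonly flattened with the built-in Flatten SMT before being written to SQL Server. A hedged sketch, not the asker's actual config; the topic, connection details, and Schema Registry URL are invented:

    curl -X POST -H "Content-Type: application/json" \
      --data '{
        "name": "mssql-jdbc-sink",
        "config": {
          "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
          "topics": "orders-avro",
          "connection.url": "jdbc:sqlserver://mssql-host:1433;databaseName=poc",
          "connection.user": "kafka",
          "connection.password": "secret",
          "auto.create": "true",
          "value.converter": "io.confluent.connect.avro.AvroConverter",
          "value.converter.schema.registry.url": "http://schema-registry:8081",
          "transforms": "flatten",
          "transforms.flatten.type": "org.apache.kafka.connect.transforms.Flatten$Value",
          "transforms.flatten.delimiter": "_"
        }
      }' \
      http://localhost:8083/connectors
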
1
vote
1 answer

Can a varchar datatype be used as a timestamp in Confluent?

I'm using Confluent to implement real-time ETL. My data source is Oracle; every table has a column named ts whose data type is varchar, but the data in this column is in YYYY-MM-DD HH24:MI:SS format. Can I use this column as a timestamp in Confluent Kafka…
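
The JDBC source's timestamp mode needs a genuine timestamp column, so a varchar ts is usually converted inside a custom query. A sketch under that assumption; the table name, connection details, and topic prefix are invented:

    # The connector appends its own WHERE clause on the timestamp column, so the
    # converted value is exposed as a real column via a subquery. Writing the
    # config to a file avoids shell-escaping the quotes inside the SQL.
    cat > oracle-ts-source.json <<'EOF'
    {
      "name": "oracle-ts-source",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:oracle:thin:@oracle-host:1521/ORCL",
        "connection.user": "etl",
        "connection.password": "secret",
        "mode": "timestamp",
        "query": "SELECT * FROM (SELECT T.*, TO_TIMESTAMP(T.ts, 'YYYY-MM-DD HH24:MI:SS') AS TS_PARSED FROM MY_TABLE T)",
        "timestamp.column.name": "TS_PARSED",
        "topic.prefix": "oracle-my-table"
      }
    }
    EOF
    curl -X POST -H "Content-Type: application/json" \
      --data @oracle-ts-source.json http://localhost:8083/connectors
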
1
vote
1 answer

Is there any way to forward Kafka messages from a topic on one server to a topic on another server?

I have a scenario where we are forwarding our application logs to a Kafka topic using fluentD agents. Since the Kafka team introduced Kerberos authentication and our fluentD version does not support this authentication, I cannot forward the logs directly. Now we…
veeresh gutti
  • 31
  • 1
  • 4
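
Cross-cluster forwarding like this is typically done with MirrorMaker rather than Connect itself. A rough sketch, assuming the destination cluster's Kerberos settings live in the producer property file; the paths and topic name are placeholders:

    # consumer.properties points at the source cluster; producer.properties points
    # at the Kerberos-secured destination (security.protocol, sasl.* settings).
    kafka-mirror-maker.sh \
      --consumer.config consumer.properties \
      --producer.config producer.properties \
      --whitelist "app-logs"
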
1
vote
1 answer

Using a schemaless JSON converter for the HBase Kafka connector

I'm using the HBase sink connector for Kafka (https://github.com/mravi/kafka-connect-hbase). I tried to implement this connector using its JsonConverter as the event parser class, like below. { "name": "test-hbase", "config": { …
boy
  • 29
  • 5
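
Schemaless JSON normally means running the JsonConverter with schemas disabled, either at the worker level or as per-connector value.converter overrides. A minimal sketch of just those keys; the worker config path is an assumption:

    # Append schemaless-JSON converter settings to the Connect worker config
    # (path is an assumption); the same two keys, kept with the value.converter
    # prefix, can instead go into the connector's own JSON config.
    cat >> /etc/kafka/connect-distributed.properties <<'EOF'
    value.converter=org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable=false
    EOF
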
1
vote
1 answer

Creating a JDBC source connector for Interbase

I am trying to create a JDBC connector for Kafka using a curl command. Please help me correct this command. curl -X POST -H "Content-Type: application/json" --data "{ \"name\": \"ib_connector\",\"config\": { \"connector.class\":…
Bommu
  • 229
  • 1
  • 4
  • 14
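
Much of the pain in such curl commands is backslash escaping; putting the JSON in a file avoids it entirely. A sketch with invented Interbase connection details; the JDBC URL format and driver class must match whichever Interbase JDBC driver is actually installed:

    cat > ib_connector.json <<'EOF'
    {
      "name": "ib_connector",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:interbase://ib-host:3050/path/to/database.ib",
        "connection.user": "sysdba",
        "connection.password": "masterkey",
        "mode": "incrementing",
        "incrementing.column.name": "ID",
        "topic.prefix": "ib-"
      }
    }
    EOF
    curl -X POST -H "Content-Type: application/json" \
      --data @ib_connector.json http://localhost:8083/connectors
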
1
vote
1 answer

Starting multiple connectors in Kafka Connect within a single distributed worker?

How do I start multiple Kafka connectors in a Kafka Connect setup within a single distributed worker (running on 3 different servers)? Right now I need 4 Kafka connectors in this distributed worker (same group.id). Currently, I am adding one…
suraj_fale
  • 978
  • 2
  • 21
  • 53
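
A single distributed worker cluster (one group.id) can host many connectors; each one is simply a separate POST to the same REST endpoint, and the workers share all connectors' tasks. A sketch with invented config file names:

    # connector1.json .. connector4.json are the four connectors' JSON configs;
    # each POST registers one connector in the same distributed cluster.
    for cfg in connector1.json connector2.json connector3.json connector4.json; do
      curl -X POST -H "Content-Type: application/json" \
        --data @"$cfg" http://localhost:8083/connectors
    done
    # List what the cluster is now running:
    curl http://localhost:8083/connectors
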
1
vote
1 answer

Kafka Connect not working with Subject Strategies

Context: I coded a couple of small Kafka Connect connectors, one that just generates random data each second and another that logs it to the console. They're integrated with a Schema Registry, so the data is serialized with Avro. I deployed them into…
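
For reference, the AvroConverter accepts the serializer's subject-name-strategy setting prefixed with the converter name in recent Confluent Platform versions. A sketch using the built-in FileStreamSinkConnector as a stand-in for the asker's logging connector; the topic, file, and Schema Registry URL are invented:

    curl -X POST -H "Content-Type: application/json" \
      --data '{
        "name": "console-logger",
        "config": {
          "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
          "topics": "random-data",
          "file": "/tmp/console-logger.out",
          "value.converter": "io.confluent.connect.avro.AvroConverter",
          "value.converter.schema.registry.url": "http://schema-registry:8081",
          "value.converter.value.subject.name.strategy": "io.confluent.kafka.serializers.subject.RecordNameStrategy"
        }
      }' \
      http://localhost:8083/connectors
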
1
vote
1 answer

Confluent Kafka Connect - JdbcSourceTask: java.sql.SQLException: Java heap space

I am trying to use timestamp mode with MySQL, but it does not create any topic in my Kafka queue when I do so, and there is also no error log. Here are the connector properties that I am using, { "name":…
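
Heap exhaustion in JdbcSourceTask with MySQL is often because the driver buffers the entire result set unless told to fetch in chunks. A sketch of the usual mitigations; the connection details, table, and column names are invented:

    # useCursorFetch/defaultFetchSize make the MySQL driver stream rows instead of
    # buffering the whole table; batch.max.rows bounds each Connect poll.
    curl -X POST -H "Content-Type: application/json" \
      --data '{
        "name": "mysql-ts-source",
        "config": {
          "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
          "connection.url": "jdbc:mysql://mysql-host:3306/mydb?useCursorFetch=true&defaultFetchSize=500",
          "connection.user": "etl",
          "connection.password": "secret",
          "mode": "timestamp",
          "timestamp.column.name": "updated_at",
          "table.whitelist": "my_table",
          "batch.max.rows": "500",
          "topic.prefix": "mysql-"
        }
      }' \
      http://localhost:8083/connectors
    # The worker heap itself can be raised via KAFKA_HEAP_OPTS before starting Connect.
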
1
vote
1 answer

"errors.deadletterqueue.topic.name" no effect in Confluent 5.0.1

I have recently upgraded from Confluent 4.0.1 to Confluent 5.0.1. The bootstrap server's version is Kafka 1.0. In my HBaseSink connector, I have configured the new feature "errors.deadletterqueue.topic.name" as follows: { "name":…
SkyOne
  • 188
  • 3
  • 15
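
The dead letter queue keys only take effect together with errors.tolerance=all, and they belong in the sink connector's own config. A sketch of the relevant keys on a hypothetical file sink (not the asker's HBase connector):

    # errors.tolerance=all is required; without it the DLQ topic name is ignored.
    curl -X PUT -H "Content-Type: application/json" \
      --data '{
        "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
        "topics": "events",
        "file": "/tmp/events.out",
        "errors.tolerance": "all",
        "errors.deadletterqueue.topic.name": "events-dlq",
        "errors.deadletterqueue.topic.replication.factor": "1",
        "errors.deadletterqueue.context.headers.enable": "true"
      }' \
      http://localhost:8083/connectors/file-sink-dlq-example/config
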
1
vote
1 answer

Two (Kafka) S3 Connectors not working simultaneously

I have Kafka Connect working in a cluster (3 nodes) with 1 connector (topic -> S3), and everything is fine: [root@dev-kafka1 ~]# curl localhost:8083/connectors/s3-postgres/status | jq -r % Total % Received % Xferd Average Speed Time Time …
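
Two sink connectors can run side by side in one Connect cluster as long as each has a unique name (separate Connect clusters additionally need distinct group.id and storage topics). A sketch of registering a second S3 sink with an invented bucket, region, and topic:

    curl -X POST -H "Content-Type: application/json" \
      --data '{
        "name": "s3-second-sink",
        "config": {
          "connector.class": "io.confluent.connect.s3.S3SinkConnector",
          "topics": "second-topic",
          "s3.bucket.name": "my-bucket",
          "s3.region": "us-east-1",
          "storage.class": "io.confluent.connect.s3.storage.S3Storage",
          "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
          "flush.size": "1000"
        }
      }' \
      http://localhost:8083/connectors
    # Each connector then reports its own status:
    curl localhost:8083/connectors/s3-second-sink/status | jq
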
1
vote
1 answer

In a Kafka connector, how do I get the bootstrap-server address my Kafka Connect is currently using?

I'm developing a Kafka sink connector on my own. My deserializer is JSONConverter. However, when someone sends wrong JSON data into my connector's topic, I want to omit this record and send it to a specific topic of my company. My confusion…
SkyOne
  • 188
  • 3
  • 15
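
The bootstrap servers are not exposed through the connector API; they come from the worker's own configuration, so the quickest check is the worker properties file. For routing bad records, the built-in dead letter queue avoids needing them at all. The path below is an assumption (Confluent package default):

    # Path varies by install; adjust to your deployment.
    grep ^bootstrap.servers /etc/kafka/connect-distributed.properties
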
1
vote
1 answer

How can the Kafka MQTT connector send the MQTT topic as the key?

I have an MQTT broker and a Kafka broker running. I have used the Kafka connector https://github.com/Landoop/stream-reactor, with the following…
Asier Gomez
  • 6,034
  • 18
  • 52
  • 105
1
vote
2 answers

How to add a column with the Kafka message timestamp in a Kafka sink connector

I am configuring my connector using properties/JSON files, and I am trying, without success, to add a timestamp column containing the Kafka timestamp from when the message was read by the source connector. I have tried adding transforms, but it's always null…
Sano
  • 469
  • 2
  • 6
  • 21
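
The usual way to capture the record's Kafka timestamp in a sink is the InsertField SMT. A sketch of just the transform keys, appended to whatever sink config is in use; the field name and file name are invented examples:

    # Transform keys appended to the sink connector's existing properties file;
    # "kafka_ts" is an example column/field name.
    cat >> my-sink.properties <<'EOF'
    transforms=insertTS
    transforms.insertTS.type=org.apache.kafka.connect.transforms.InsertField$Value
    transforms.insertTS.timestamp.field=kafka_ts
    EOF
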
1
vote
1 answer

How to add a JDBC driver to Kafka Connect on DC/OS?

We are running Kafka Connect 4.1.1 on DC/OS using the Confluent community package. How can we upload or add our JDBC driver to the remote cluster? Update: it's a package installed from the DC/OS catalog, which is a Mesos framework running Docker images.
user432024
  • 4,392
  • 8
  • 49
  • 85
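
The driver jar has to end up on every worker's classpath next to the kafka-connect-jdbc plugin; with the Confluent Docker images that the DC/OS package runs, that usually means baking it into a custom image. A sketch; the image tag, driver jar, registry, and target path are assumptions based on the Confluent image layout:

    # Bake the driver into a custom image so every Mesos-scheduled worker has it.
    cat > Dockerfile <<'EOF'
    FROM confluentinc/cp-kafka-connect:4.1.1
    # Put the JDBC driver next to the kafka-connect-jdbc plugin jars
    COPY mysql-connector-java-5.1.46.jar /usr/share/java/kafka-connect-jdbc/
    EOF
    docker build -t my-registry/cp-kafka-connect-jdbc:4.1.1 .
    docker push my-registry/cp-kafka-connect-jdbc:4.1.1
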
1
vote
1 answer

ExtractField and Parse JSON in kafka-connect sink

I have a kafka-connect flow of MongoDB -> Kafka Connect -> Elasticsearch sending data end to end OK, but the payload document is JSON encoded. Here's my source MongoDB document. { "_id": "1541527535911", "enabled": true, "price": 15.99, "style":…
Peter Lyons
  • 142,938
  • 30
  • 279
  • 274
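
If only a single field of the document is needed downstream, the built-in ExtractField SMT can pull it out; turning a JSON-encoded string payload into a structured record, by contrast, generally needs either schemas disabled on the JsonConverter or a custom transform. A sketch of the ExtractField keys, using the asker's "style" field; the sink config file name is invented:

    # Keys appended to the Elasticsearch sink's config; pulls the "style" field
    # out of the record value before indexing.
    cat >> es-sink.properties <<'EOF'
    transforms=extractStyle
    transforms.extractStyle.type=org.apache.kafka.connect.transforms.ExtractField$Value
    transforms.extractStyle.field=style
    EOF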