Questions tagged [apache-kafka-connect]

Apache Kafka Connect is a tool for scalable and reliable streaming data between Apache Kafka and other data systems.

It was first released with Kafka 0.9. It lets you import data from external systems (e.g., databases) into Kafka and export data from Kafka into external systems (e.g., Hadoop). Kafka Connect is a framework with a plug-in mechanism that lets you provide custom connectors for your system of choice.
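As a concrete illustration, the file connectors that ship with Kafka can be run in standalone mode with a small properties file (the file path and topic name below are hypothetical):

```properties
# connect-file-source.properties — a minimal, illustrative source connector
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=/tmp/test.txt
topic=connect-test
```

Passed to `connect-standalone.sh` together with a worker properties file, this streams each line of `/tmp/test.txt` into the `connect-test` topic.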

Documentation

3693 questions
1 vote, 2 answers

Is it possible to apply SMT (Single Message Transforms) to messages from specified topics only

I'm streaming database change events from a MySQL database to Kafka using the Debezium MySQL connector. I need to apply specific transformations to records from some specified tables (but not from the others). Is there a way of doing it using only…
Ruslan • 11 • 2
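Since Apache Kafka 2.6, Connect supports SMT predicates, which apply a transform only to records whose topic matches a condition. A sketch against a hypothetical Debezium topic (the transform alias, field, and topic pattern are illustrative):

```json
{
  "transforms": "mask",
  "transforms.mask.type": "org.apache.kafka.connect.transforms.MaskField$Value",
  "transforms.mask.fields": "ssn",
  "transforms.mask.predicate": "onOrders",
  "predicates": "onOrders",
  "predicates.onOrders.type": "org.apache.kafka.connect.transforms.predicates.TopicNameMatches",
  "predicates.onOrders.pattern": "dbserver1\\.inventory\\.orders"
}
```

Records from topics not matching the pattern pass through the connector untouched.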
1 vote, 1 answer

Error connecting to Cloud SQL with SSL using Debezium

Objective: To use Debezium to capture changes from Cloud SQL. The instance of Cloud SQL is SSL-enabled according to the instructions here. Scenario: I have Debezium Connect, Kafka and ZooKeeper running as Docker containers on my local machine. I have…
Arko Chakraborti • 403 • 5 • 19
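For reference, the Debezium MySQL connector exposes SSL settings via `database.ssl.*` properties; a hedged sketch (paths and passwords are placeholders):

```properties
# Illustrative Debezium MySQL SSL configuration
database.ssl.mode=verify_ca
database.ssl.keystore=/etc/kafka/secrets/client-keystore.jks
database.ssl.keystore.password=changeit
database.ssl.truststore=/etc/kafka/secrets/server-ca-truststore.jks
database.ssl.truststore.password=changeit
```

`database.ssl.mode` accepts values such as `disabled`, `preferred`, `required`, `verify_ca`, and `verify_identity`.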
1 vote, 1 answer

Vault for Kafka distributed connectors

I am using a JBoss-based vault to secure sensitive data such as the database credentials. I use a Java-based HTTP REST client to create distributed Kafka connectors, but ended up with a security concern such that a request for the connector's…
Marco99 • 1,639 • 1 • 19 • 32
1 vote, 1 answer

kafka connect - How to filter schema metadata from payload

I'm trying to remove schema from the payload and here are the…
Clover • 507 • 7 • 22
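When the JsonConverter is used with schemas enabled, every record is wrapped in a schema/payload envelope. Disabling schemas on the converter removes it; a minimal sketch:

```properties
# Worker or connector converter settings: drop the schema/payload envelope
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```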
1 vote, 1 answer

push messages from kafka consumer to mongodb

I have created a Kafka consumer using 'kafka-node'; on the event consumer.on('message', () => { connecting to mongodb and inserting into a collection }). The mongo.js file is used to create a connection to Mongo and return the object: const MongoClient =…
1 vote, 2 answers

Kafka Connect JDBC Connector - Exiting WorkerSinkTask due to unrecoverable exception

I am using the JDBC sink connector and have a bad message in the topic. I know why the message is bad (it is failing due to an FK constraint violation because of an issue with a producer). The error being reported by the worker task…
Sam Shiles • 10,529 • 9 • 60 • 72
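Since Kafka 2.0, sink connectors can be configured to tolerate bad records instead of killing the task, routing failures to a dead-letter queue. A sketch (the DLQ topic name is hypothetical):

```properties
# Keep the task alive and route failing records to a dead-letter topic
errors.tolerance=all
errors.deadletterqueue.topic.name=dlq-jdbc-sink
errors.deadletterqueue.context.headers.enable=true
errors.log.enable=true
```

Note that records failing on a constraint violation are then skipped rather than retried, so the DLQ topic should be monitored.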
1 vote, 1 answer

Can I have Kafka consumers/sink connectors skip specific partitions within a topic?

Is there any option in Kafka Connect to specify which partitions to read messages from? Basically, I am looking for an option in Kafka Connect to manually assign a list of partitions to read, similar to the assign() method in KafkaConsumer…
1 vote, 2 answers

Kafka and Kafka Connect deployment environment

If I already have Kafka running on premises, is Kafka Connect just a configuration on top of my existing Kafka, or does Kafka Connect require its own server/environment separate from that of my existing Kafka?
Artanis • 561 • 1 • 7 • 26
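Kafka Connect ships with Kafka itself and needs no separate server product; it runs as one or more worker JVM processes configured to point at the existing brokers. A sketch of a distributed worker config (broker address and topic names are illustrative):

```properties
# connect-distributed.properties — a Connect worker pointed at an existing cluster
bootstrap.servers=existing-kafka:9092
group.id=connect-cluster
config.storage.topic=connect-configs
offset.storage.topic=connect-offsets
status.storage.topic=connect-status
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
```

Workers can run on the broker machines or on separate hosts; separate hosts are common in production so Connect and the brokers do not compete for resources.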
1 vote, 2 answers

Kafka Connect S3 sink - how to use the timestamp from the message itself [timestamp extractor]

I've been struggling with a problem using Kafka Connect and the S3 sink. First the structure: { Partition: number Offset: number Key: string Message: json string Timestamp: timestamp } Normally when posting to Kafka, the timestamp…
Hespen • 1,384 • 2 • 17 • 27
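The Confluent S3 sink's time-based partitioner can take its timestamp from a field of the record itself via `timestamp.extractor=RecordField`. A sketch (the field name mirrors the `Timestamp` field in the question; the other values are illustrative):

```json
{
  "partitioner.class": "io.confluent.connect.storage.partitioner.TimeBasedPartitioner",
  "timestamp.extractor": "RecordField",
  "timestamp.field": "Timestamp",
  "path.format": "'year'=YYYY/'month'=MM/'day'=dd",
  "partition.duration.ms": "3600000",
  "locale": "en-US",
  "timezone": "UTC"
}
```

`Wallclock` and `Record` are the other extractor options, using the system clock and the Kafka record timestamp respectively.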
1 vote, 1 answer

How can I save stream result into remote database via REST or anything easily

I have examined the Confluent Kafka Streams word-count and anomaly-detection examples. In these examples the result is written to a topic. Instead of this, how can I save the result into a remote database via REST or anything easily and quickly? Are there any…
validator • 13 • 3
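A common alternative to hand-rolled REST calls is to keep writing the Streams result to its output topic and attach a JDBC sink connector to that topic. A hedged sketch (topic, table, and connection details are hypothetical):

```json
{
  "name": "wordcount-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "streams-wordcount-output",
    "connection.url": "jdbc:postgresql://db-host:5432/mydb",
    "connection.user": "kafka",
    "connection.password": "secret",
    "auto.create": "true",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "pk.fields": "word"
  }
}
```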
1 vote, 1 answer

How to make Kafka Sink Connector work with Avro serialized key and value to postgres

I have a Kafka topic containing messages with an Avro-serialized key and Avro-serialized value. I am trying to set up a sink connector to land these values into a table in a Postgres database (AWS RDS in this case). I have tried a number of…
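For Avro-serialized keys and values, the sink connector (or worker) needs the AvroConverter wired to a Schema Registry. A sketch (the registry URL is a placeholder):

```properties
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://schema-registry:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081
```

The JDBC sink can then derive the Postgres table schema from the value's Avro schema.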
1 vote, 1 answer

is Kafka Connect Cassandra compatible with Cosmos Cassandra?

I am using Kafka Connect Cassandra to read from a Kafka topic and insert into Cassandra; is this compatible with Cosmos Cassandra?
1 vote, 1 answer

Can a Kafka Connector load its own name?

According to Kafka Documentation Connector configurations are simple key-value mappings. For standalone mode these are defined in a properties file and passed to the Connect process on the command line. Most configurations are connector…
Novemberland • 530 • 3 • 8 • 25
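For context, the `name` key set in the connector's properties file is part of the same config map that the runtime hands to the connector's `start()` method, so a connector can typically read its own name from there. An illustrative standalone config (all values hypothetical):

```properties
# my-connector.properties — "name" travels with the rest of the config
name=my-jdbc-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
```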
1 vote, 1 answer

Kafka not working after consumer has started to consume data

I am new to Kafka and installed Kafka on Windows 10 using the steps in https://kafka.apache.org/quickstart. In step 5, after starting the consumer, I am getting the following errors after running the command bin/kafka-console-consumer.sh…
1 vote, 0 answers

Kafka connect jdbc sink upsert mode issue

I'm trying to replicate a table in real time using Kafka Connect. The database used is MySQL v5.7. When working with insert and update modes separately, the columns behave as expected. However, when I use upsert mode, no change is…
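For the JDBC sink, upsert mode only works when the connector knows which columns form the key; without a `pk.mode`/`pk.fields` setting it cannot build the `INSERT ... ON DUPLICATE KEY UPDATE` statement that upserts compile to on MySQL. A sketch (the field name is hypothetical):

```properties
# Illustrative JDBC sink upsert settings
insert.mode=upsert
pk.mode=record_key
pk.fields=id
```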