Questions tagged [apache-kafka-connect]

Apache Kafka Connect is a tool for scalable and reliable streaming of data between Apache Kafka and other data systems.

It was first released with Kafka 0.9. It lets you import data from external systems (e.g., databases) into Kafka and export data from Kafka into external systems (e.g., Hadoop). Apache Kafka Connect is a framework with a plug-in mechanism that allows you to provide custom connectors for your system of choice.
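
In practice, a connector is registered by posting a JSON configuration to a Connect worker's REST interface. The sketch below is a minimal, hypothetical JDBC source (host, credentials, table, and topic prefix are placeholders, and it assumes the Confluent JDBC connector is installed on the worker's plugin path):

# Hypothetical example: stream the "orders" table of a MySQL database into the topic "mysql-orders"
curl -X POST -H 'Content-Type: application/json' http://localhost:8083/connectors -d '{
  "name": "example-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://db-host:3306/inventory",
    "connection.user": "user",
    "connection.password": "secret",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "table.whitelist": "orders",
    "topic.prefix": "mysql-"
  }
}'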

Documentation

3693 questions
1 vote • 1 answer

Kafka Connect: ensuring ordering

I want to use Mongo change streams to push change events from MongoDB into a Kafka topic using Kafka Connect. The good news: Kafka maintains ordering inside a partition, and Mongo maintains ordering using a global clock. But what about the middle? What…
toto • 1,197 • 2 • 15 • 26
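
If the source side is the official MongoDB Kafka source connector, a minimal sketch might look like the following (connection URI, database, and collection are placeholders). With a single task, records are produced in change-stream order, and per-document ordering is preserved as long as records for the same document are keyed consistently so they land in the same partition:

# Hypothetical MongoDB change-stream source (official MongoDB Kafka connector, not Debezium)
curl -X POST -H 'Content-Type: application/json' http://localhost:8083/connectors -d '{
  "name": "mongo-changestream-source",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "connection.uri": "mongodb://mongo-host:27017",
    "database": "shop",
    "collection": "orders",
    "tasks.max": "1"
  }
}'
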
1 vote • 2 answers

Kafka to Snowflake - Failed to find any class that implements Connector

I'm attempting to connect to a Kafka cluster and write data from a topic to Snowflake. The error I'm getting is: java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements…
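
This error usually means the worker cannot see the connector jars: they have to live under a directory listed in plugin.path on every Connect worker. A rough check, assuming the Snowflake connector was unpacked under /usr/share/kafka/plugins (paths are placeholders):

# plugin.path in the worker properties must point at the parent directory of the connector folder
grep plugin.path config/connect-distributed.properties
# the Snowflake connector jars should sit in their own subdirectory, e.g.:
ls /usr/share/kafka/plugins/snowflake-kafka-connector/
# the connector config can then reference
#   "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector"
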
1 vote • 0 answers

MapperParsingException error while indexing kafka topic in elasticsearch with kafka-elasticsearch sink connector

I am using Elasticsearch 7.4 and the Confluent kafka-elasticsearch sink connector. 1) Created the kafka-elasticsearch sink connector in Confluent with the configuration below: curl -XPOST -H 'Content-type:application/json' ':8083/connectors' -d '{ "name" :…
1 vote • 1 answer

kafka-connect-elasticsearch: how to send deletes of documents?

I have a processing stream that looks like this: mysql.database -> debezium-connector -> database topic -> faust.agent(stream processing to add a field) -> sink topic -> elasticsearch-sink-connector -> elasticsearch cluster This processing stream…
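
For the Elasticsearch end of such a pipeline, the Confluent Elasticsearch sink can translate Kafka tombstone records (records with a null value) into document deletes: with key.ignore=false the record key is used as the document id, and behavior.on.null.values=delete removes the matching document. A sketch with placeholder hostnames and topic names:

# Hypothetical Elasticsearch sink that deletes documents on tombstone records
curl -X POST -H 'Content-Type: application/json' http://localhost:8083/connectors -d '{
  "name": "es-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "connection.url": "http://elasticsearch:9200",
    "topics": "sink-topic",
    "key.ignore": "false",
    "behavior.on.null.values": "delete"
  }
}'
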
1 vote • 0 answers

Kafka not retrieving data from ClickHouse

I have to push data from ClickHouse to Kafka topics, so I tried to use the Confluent JDBC connector. I am following this tutorial, which uses MySQL instead of ClickHouse. Here is my configuration; it works with MySQL but has this error with…
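
In principle the MySQL tutorial's JDBC source configuration carries over, provided the ClickHouse JDBC driver jar is placed next to the kafka-connect-jdbc jars on the plugin path; otherwise the worker cannot open the connection. A sketch with placeholder host, table, and column names:

# Hypothetical JDBC source reading from ClickHouse over its JDBC driver (HTTP port 8123)
curl -X POST -H 'Content-Type: application/json' http://localhost:8083/connectors -d '{
  "name": "clickhouse-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:clickhouse://clickhouse-host:8123/default",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "table.whitelist": "events",
    "topic.prefix": "clickhouse-"
  }
}'
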
1 vote • 1 answer

Kafka MirrorMaker not replicating topics when source and destination clusters do not have the same configuration

We have set up MirrorMaker to replicate topics between two Kafka clusters where the source and destination setups are different (source = 2 servers, destination = 3 servers). When starting MirrorMaker, it throws the following error: "Error:…
1 vote • 1 answer

Debezium MongoDB connector properties to limit CDC to specific collections

Can we limit the amount of data we retrieve via connector properties in the Debezium MongoDB connector configuration? Debezium looks for CDC events across the entire database, as far as I understand, and I couldn't find a way to limit it to a few…
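
The Debezium MongoDB connector does expose filter properties for this; in recent releases they are database.include.list and collection.include.list (older versions call them database.whitelist / collection.whitelist). A sketch with placeholder values:

# Hypothetical Debezium MongoDB source restricted to two collections
curl -X POST -H 'Content-Type: application/json' http://localhost:8083/connectors -d '{
  "name": "mongo-cdc",
  "config": {
    "connector.class": "io.debezium.connector.mongodb.MongoDbConnector",
    "mongodb.hosts": "rs0/mongo-host:27017",
    "mongodb.name": "dbserver1",
    "database.include.list": "shop",
    "collection.include.list": "shop.orders,shop.customers"
  }
}'
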
1 vote • 0 answers

CamelCase pattern on Kafka Elastic Connector

I was going through Elasticsearch best practices, and the advice is not to use wildcards but to use a CamelCase pattern for better performance. It is defined very well in the link below: camelcase_tokenizer. I am using the Elasticsearch Kafka connector to consume…
Nitin • 3,533 • 2 • 26 • 36
1 vote • 1 answer

Set up kafka-connect locally on Ubuntu to use with Debezium and WSO2 Stream Processor?

This is a newbie question, but I am not able to figure out how to install kafka-connect on my local machine (Ubuntu 18.04) to use it with Debezium. I already have Kafka, ZooKeeper, and WSO2 Stream Processor working fine, but for kafka-connect I am facing…
Rahul Anand • 523 • 1 • 7 • 20
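
Kafka Connect is not a separate package: it ships with the Kafka distribution itself, and a distributed worker is started with bin/connect-distributed.sh plus a worker properties file whose plugin.path points at the directory holding the Debezium connector jars. A sketch, assuming Kafka is unpacked under ~/kafka and the Debezium connector under ~/kafka-plugins (both paths are placeholders):

# Point the worker at the directory containing the unpacked Debezium connector
echo 'plugin.path=/home/user/kafka-plugins' >> ~/kafka/config/connect-distributed.properties
# Start the distributed worker; its REST API listens on port 8083 by default
~/kafka/bin/connect-distributed.sh ~/kafka/config/connect-distributed.properties
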
1 vote • 0 answers

How does schema evolution in JDBC Kafka Connector work?

I've configured my Confluent Schema Registry compatibility to BACKWARD_TRANSITIVE. I'm using the Confluent JDBC connector to pull incremental changes from a MySQL DB. Suppose my initial schema v1 looks like this: {"connect.name": "customers", "type":…
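
Under BACKWARD (and BACKWARD_TRANSITIVE) compatibility, readers using the new schema must still be able to read data written with older schemas, so the connector can only register changes such as removing a field or adding a field that carries a default. A candidate schema can be checked against the registry before the table is altered; the sketch below uses a hypothetical v2 that adds a nullable email field with a default (subject name and fields are placeholders):

# Hypothetical compatibility check against Confluent Schema Registry
curl -X POST -H 'Content-Type: application/vnd.schemaregistry.v1+json' \
  http://localhost:8081/compatibility/subjects/customers-value/versions/latest \
  -d '{"schema": "{\"type\":\"record\",\"name\":\"customers\",\"fields\":[{\"name\":\"id\",\"type\":\"int\"},{\"name\":\"email\",\"type\":[\"null\",\"string\"],\"default\":null}]}"}'
# a response of {"is_compatible":true} means the change is acceptable
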
1 vote • 1 answer

Kafka JDBC Source connector: create topics from column values

I have a microservice that uses OracleDB to publish system changes to the EVENT_STORE table. The EVENT_STORE table contains a column TYPE with the name of the type of the event. Is it possible for the JDBC source Kafka connector to take the EVENT_STORE…
oalmora90 • 13 • 4
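
One approach is to keep a single source connector on EVENT_STORE and route each record with a single message transform. Confluent's kafka-connect-transforms plugin (installed separately; it is not part of plain Apache Kafka) provides ExtractTopic, which replaces a record's topic with the value of a chosen field such as TYPE. A sketch with placeholder connection details:

# Hypothetical JDBC source whose records are routed to one topic per event TYPE
curl -X POST -H 'Content-Type: application/json' http://localhost:8083/connectors -d '{
  "name": "event-store-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:oracle:thin:@oracle-host:1521/ORCLPDB1",
    "mode": "incrementing",
    "incrementing.column.name": "ID",
    "table.whitelist": "EVENT_STORE",
    "topic.prefix": "events-",
    "transforms": "routeByType",
    "transforms.routeByType.type": "io.confluent.connect.transforms.ExtractTopic$Value",
    "transforms.routeByType.field": "TYPE"
  }
}'
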
1 vote • 0 answers

Load data from Oracle table with case sensitive name into Kafka

I have an Oracle table with a case-sensitive name and column names. Normally, I can select data from this table with select * from "LocationHistory" where "Date" > trunc(sysdate, 'DD'). But I can't save this as the "query": parameter in the Kafka connector…
andrewd76 • 35 • 5
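
If the blocker is only quoting, the double quotes around the identifiers can be escaped inside the JSON payload. A sketch (connection details are placeholders, and "bulk" mode is used here purely for illustration):

# Write the connector config to a file so the escaped quotes and the SQL quotes survive intact
cat > location-history.json <<'EOF'
{
  "name": "location-history-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:oracle:thin:@oracle-host:1521/ORCLPDB1",
    "mode": "bulk",
    "query": "select * from \"LocationHistory\" where \"Date\" > trunc(sysdate, 'DD')",
    "topic.prefix": "location-history"
  }
}
EOF
curl -X POST -H 'Content-Type: application/json' http://localhost:8083/connectors -d @location-history.json
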
1 vote • 1 answer

Kafka-Connect For MSSQL Invalid value java.sql.SQLException: No suitable driver found for for configuration

I am trying to connect kafka-connect to my local MSSQL via localhost:3030. I am receiving this error when I try to make a new connection for MSSQL on CentOS 7 (Linux). The MSSQL data comes from an external IP (Windows); my consumer is inside of Linux…
newUser • 386 • 5 • 17
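
"No suitable driver found" usually means the SQL Server JDBC driver jar is missing from the directory that holds the kafka-connect-jdbc jars, or the URL does not use the jdbc:sqlserver:// form. A sketch, assuming the Confluent JDBC connector lives under /usr/share/java/kafka-connect-jdbc (paths and the driver version are placeholders):

# Put the Microsoft JDBC driver next to the JDBC connector jars, then restart the worker
cp mssql-jdbc-8.4.1.jre8.jar /usr/share/java/kafka-connect-jdbc/
# the connection URL should then look like
#   "connection.url": "jdbc:sqlserver://192.168.1.10:1433;databaseName=mydb"
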
1 vote • 1 answer

java.lang.NoClassDefFoundError: org/apache/http/nio/conn/SchemeIOSessionStrategy

I am getting the error below while trying to run the Kafka Elasticsearch sink connector. I have verified that the required jar is available at the plugin.path location. List of…
1 vote • 1 answer

Looking for a Kafka connector to upsert data to Elasticsearch

Is there any Kafka connector that can handle this kind of request, please? I receive data in a Kafka topic in this format (the number of rows inside the JSON is random): { "1574922337":[{"price": 1, "product": 2], "1574922338":[{"price": 13,…
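
The Confluent Elasticsearch sink can perform upserts (write.method=upsert with key.ignore=false, so the record key becomes the document id), but it expects one document per Kafka record, so a payload like the one above would first have to be split into individual keyed records, e.g. by a stream processor, before reaching the sink. A sketch with placeholder names:

# Hypothetical Elasticsearch sink doing upserts keyed by the Kafka record key
curl -X POST -H 'Content-Type: application/json' http://localhost:8083/connectors -d '{
  "name": "es-upsert-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "connection.url": "http://elasticsearch:9200",
    "topics": "prices",
    "key.ignore": "false",
    "write.method": "upsert"
  }
}'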