
I have a use case where I have to push all my MySQL database data to a Kafka topic. Now, I know I can get this up and running using a Kafka connector, but I want to understand how it all works internally without using a connector. In my Spring Boot project I have already created a Kafka producer class where I set all the configuration, create a ProducerRecord, and so on.

Has anyone tried this approach before? Can anyone throw some light on this?

Nithin Prasad
  • Kafka Connect is specifically designed for doing this. Can you explain why you don't want to use it? Because it sounds like you want to reinvent the wheel ;-) – Robin Moffatt Sep 04 '19 at 16:20
  • @RobinMoffatt You're right. I just want to know the insights of the whole framework that's behind. Please lead me through some examples. – Nithin Prasad Sep 04 '19 at 16:26
  • Also, I am using protobuf schema as a message format for data serialization/deserialization. I am able to compile my .proto files to generate the .java classes. The only place I am stuck is to get the data from the database. Some help would be appreciated.:) – Nithin Prasad Sep 04 '19 at 16:44
  • Debezium doesn't use JDBC. It reads the MySQL binlog, then serializes it into a Kafka message... If you did want to use JDBC, then just put a select statement in a repeated loop, and you'd be close – OneCricketeer Sep 04 '19 at 23:33

1 Answer


Create entities using Spring JPA for your tables and send the data to the topic using findAll(). Use a scheduler to fetch the data periodically and send it to the topic. You can add your own logic for fetching from the DB and separate logic for sending it to the Kafka topic: fetch by auto-increment ID, fetch by last-updated timestamp, or do a bulk fetch. The same logic as the JDBC source connector can be implemented.
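A minimal sketch of the auto-increment polling idea described above: keep track of the highest ID already published, and on each scheduled run send only the rows above it. The in-memory `table` list and the `publish()` stub are hypothetical stand-ins for a JPA repository query (`WHERE id > :lastPublishedId`) and a `kafkaTemplate.send(...)` call; in Spring, `pollOnce` would carry `@Scheduled`.

```java
import java.util.ArrayList;
import java.util.List;

public class IncrementalFetcher {
    // Simulated table row; in Spring this would be a JPA @Entity.
    record Row(long id, String payload) {}

    // Highest auto-increment id already sent; persist this in real code
    // so a restart does not re-publish everything.
    private long lastPublishedId = 0;

    final List<Row> published = new ArrayList<>();

    // One scheduled run: fetch rows newer than the last published id
    // (rows are assumed to be in ascending id order, as a
    // "WHERE id > ? ORDER BY id" query would return them).
    public int pollOnce(List<Row> table) {
        int sent = 0;
        for (Row row : table) {
            if (row.id() > lastPublishedId) {
                publish(row);                 // stand-in for kafkaTemplate.send(topic, ...)
                lastPublishedId = row.id();   // advance only after the send
                sent++;
            }
        }
        return sent;
    }

    private void publish(Row row) {
        published.add(row);
    }

    public static void main(String[] args) {
        IncrementalFetcher fetcher = new IncrementalFetcher();
        List<Row> table = new ArrayList<>(List.of(new Row(1, "a"), new Row(2, "b")));
        System.out.println(fetcher.pollOnce(table)); // 2 -> both rows published
        table.add(new Row(3, "c"));
        System.out.println(fetcher.pollOnce(table)); // 1 -> only the new row
    }
}
```

The same structure works for a last-updated-timestamp fetch: replace the ID comparison with a timestamp comparison, keeping in mind that timestamps are not unique, so rows sharing the boundary value may be sent twice.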

Kafka Connect will do this in an optimized way.

arun_hareesh