
I am new to confluent-kafka-python. I am running Kafka using docker-compose from cp-all-in-one, and I created a producer that produces data to a new topic, the same as in the official tutorial.

However, I cannot find any information on how to set up the schema or the connection to the database in the code.

I am trying to save the produced data into a database table so that I can load it in the UI later. Right now, every time I refresh the Confluent Control Center topic page, it seems I lose the data. Is there any tutorial on this, or am I using Kafka incorrectly?

sam

1 Answer


If you send data with the AvroProducer, a schema gets registered for you.
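A minimal sketch of that, assuming the cp-all-in-one defaults (broker on localhost:9092, Schema Registry on localhost:8081) and a made-up `users` topic and schema; newer releases of confluent-kafka-python deprecate `AvroProducer` in favour of `SerializingProducer`, but this matches the class named above:

```python
from confluent_kafka import avro
from confluent_kafka.avro import AvroProducer

# Assumed value schema and topic name, for illustration only.
value_schema = avro.loads("""
{
  "type": "record",
  "name": "User",
  "namespace": "example.avro",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "age",  "type": "int"}
  ]
}
""")

# cp-all-in-one defaults: broker on 9092, Schema Registry on 8081.
producer = AvroProducer(
    {
        "bootstrap.servers": "localhost:9092",
        "schema.registry.url": "http://localhost:8081",
    },
    default_value_schema=value_schema,
)

# Producing a record serializes it as Avro and registers the schema
# with Schema Registry (under the subject "users-value") for you.
producer.produce(topic="users", value={"name": "sam", "age": 30})
producer.flush()
```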

There is no database in those links, though, so it is unclear what you mean there.

If by "topic UI", you mean Confluent Control Center, viewing messages is documented here

OneCricketeer
  • Thanks, the links helped a lot. I thought Kafka had some framework for pushing the data to a database. I need the data live and also need to store some of it for later use. I am looking for a connection to a SQL database to store the data. – sam Dec 25 '20 at 16:18
  • Yes, it's called Kafka Connect, which is a Java framework. There are JDBC connector tutorials on the Confluent site (see the sink config sketch after these comments). – OneCricketeer Dec 25 '20 at 17:26
  • Thanks, that is helpful, but is there any tutorial on setting up a MySQL database as a sink? The information on Confluent is very confusing. – sam Jan 08 '21 at 16:22
  • Try https://dev.to/rmoff/streaming-data-from-kafka-to-a-database-video-walkthrough-4o5p – OneCricketeer Jan 08 '21 at 16:55
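To illustrate what that walkthrough sets up, here is a rough sketch that registers a JDBC sink connector through the Kafka Connect REST API (port 8083 in cp-all-in-one). The connector class and property names come from Confluent's JDBC connector, but the connector name, topic, and MySQL URL/credentials are placeholders to replace with your own, and the MySQL JDBC driver must already be available on the Connect worker:

```python
import json

import requests  # third-party HTTP client: pip install requests

# Assumed Kafka Connect REST endpoint (cp-all-in-one default port).
CONNECT_URL = "http://localhost:8083/connectors"

# JDBC sink connector config: reads Avro records from the assumed `users` topic
# and writes them into a MySQL table. URL, user, and password are placeholders.
connector = {
    "name": "mysql-sink",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max": "1",
        "topics": "users",
        "connection.url": "jdbc:mysql://mysql:3306/mydb",
        "connection.user": "myuser",
        "connection.password": "mypassword",
        "auto.create": "true",   # create the target table from the record schema
        "insert.mode": "insert",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://localhost:8081",
    },
}

# Create the connector; Kafka Connect returns 201 Created on success.
response = requests.post(
    CONNECT_URL,
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
response.raise_for_status()
print(response.json())
```

With `auto.create` enabled, the sink derives the target table's columns from the registered Avro schema, which is how producing with AvroProducer (above) and sinking with Kafka Connect fit together.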