I have a use case in which I receive my input data on a Kafka topic. This data has to be transformed, enriched, and sent to a PostgreSQL table. My application uses Quarkus Kafka Streams to consume data from the input topic and process it. However, I cannot find any sink/connector to a database (in my case PostgreSQL). If I am not wrong, we can only send the transformed result from a Kafka Streams `KStream` to another Kafka topic or a `KTable`, and then use another service such as Kafka Connect or Flink to read the output topic and write the data into the target Postgres table.
Is there a way to persist the data from a `KStream` directly to a PostgreSQL table? As it is a streaming application, I do not want to hit the DB for each message; I would like to batch-insert the data into the table.
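To make the batching requirement concrete, here is a minimal stdlib-only sketch of the kind of buffering I have in mind (class and names are hypothetical, no Kafka or JDBC involved): records accumulate until a size threshold or a maximum age is reached, then the whole batch is flushed at once. In the real application the flush callback would issue a JDBC batch `INSERT` (e.g. via `PreparedStatement.addBatch()` / `executeBatch()`).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch: buffer records and flush them as one batch once a
// size threshold or a time window is exceeded. The flushAction stands in
// for a JDBC batch insert in the real application.
class BatchBuffer<T> {
    private final int maxSize;
    private final long maxAgeMillis;
    private final Consumer<List<T>> flushAction;
    private final List<T> buffer = new ArrayList<>();
    private long oldestEntryAt = -1;

    BatchBuffer(int maxSize, long maxAgeMillis, Consumer<List<T>> flushAction) {
        this.maxSize = maxSize;
        this.maxAgeMillis = maxAgeMillis;
        this.flushAction = flushAction;
    }

    synchronized void add(T record) {
        if (buffer.isEmpty()) {
            oldestEntryAt = System.currentTimeMillis(); // age measured from first buffered record
        }
        buffer.add(record);
        if (buffer.size() >= maxSize
                || System.currentTimeMillis() - oldestEntryAt >= maxAgeMillis) {
            flush();
        }
    }

    synchronized void flush() {
        if (buffer.isEmpty()) return;
        flushAction.accept(new ArrayList<>(buffer)); // hand over a copy, then reset
        buffer.clear();
    }
}
```

My open question is where this logic should live: calling it from inside a `KStream` operator feels like a workaround, and I would also need a periodic trigger (something like the Processor API's punctuation) to flush partially filled batches, plus care around delivery guarantees if the DB write fails after the offset is committed.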
Thanks a lot in advance for any pointers.