I have a scenario where I need to fetch millions of records from an Oracle database and send them to an Apache Kafka producer in chunks of 1,000.
On subsequent runs, I need to avoid re-pulling records that have already been pushed to Kafka and instead select only the new or updated records. Essentially, it's a form of delta (incremental) load processing.
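Here is a rough sketch of what I was considering (the table name `my_table`, the `last_updated` column, the topic `my-topic`, the connection details, and the file-based watermark are all placeholders; I'm not sure this is the right approach):

```java
import java.nio.file.*;
import java.sql.*;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class DeltaLoader {

    private static final int CHUNK_SIZE = 1000;
    private static final Path WATERMARK_FILE = Paths.get("watermark.txt");

    public static void main(String[] args) throws Exception {
        // High-water mark from the previous run (epoch millis); start from 0 on the first run.
        long lastWatermark = Files.exists(WATERMARK_FILE)
                ? Long.parseLong(Files.readString(WATERMARK_FILE).trim())
                : 0L;

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        long newWatermark = lastWatermark;

        try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1", "user", "password");
             KafkaProducer<String, String> producer = new KafkaProducer<>(props);
             PreparedStatement ps = conn.prepareStatement(
                     // Only rows changed since the last run; ordered so the watermark advances monotonically.
                     "SELECT id, payload, last_updated FROM my_table "
                   + "WHERE last_updated > ? ORDER BY last_updated")) {

            // Stream rows from Oracle in batches instead of loading millions of rows into memory.
            ps.setFetchSize(CHUNK_SIZE);
            ps.setTimestamp(1, new Timestamp(lastWatermark));

            try (ResultSet rs = ps.executeQuery()) {
                int inChunk = 0;
                while (rs.next()) {
                    String key = rs.getString("id");
                    String value = rs.getString("payload");
                    producer.send(new ProducerRecord<>("my-topic", key, value));

                    newWatermark = Math.max(newWatermark, rs.getTimestamp("last_updated").getTime());

                    // Flush after every 1,000 records so each chunk is fully delivered before moving on.
                    if (++inChunk == CHUNK_SIZE) {
                        producer.flush();
                        inChunk = 0;
                    }
                }
            }
            producer.flush();
        }

        // Persist the new high-water mark only after all sends have succeeded.
        Files.writeString(WATERMARK_FILE, Long.toString(newWatermark));
    }
}
```

I'm not sure that tracking a `last_updated` watermark like this is a sound way to identify the already-pushed records, especially at this volume.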
Please let me know if there is a recommended approach I should follow for this scenario.