I built a Kafka producer using kafka-python to send records to a remote broker. If the network connection is down for longer than request_timeout_ms (20 s here), my error callback receives this exception:
KafkaTimeoutError: Batch for TopicPartition(topic='first_topic', partition=0) containing 117 record(s) expired: 20 seconds have passed since last append
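For reference, here is a minimal sketch of my setup (the broker address is a placeholder and serialization is simplified):

    from kafka import KafkaProducer
    from kafka.errors import KafkaTimeoutError

    producer = KafkaProducer(
        bootstrap_servers='remote-broker:9092',  # placeholder address
        request_timeout_ms=20000,                # 20 s, matching the error above
    )

    def on_send_error(exc):
        # called with the exception when the batch expires
        if isinstance(exc, KafkaTimeoutError):
            print('Batch expired:', exc)

    producer.send('first_topic', b'some record').add_errback(on_send_error)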
I don't want to set the timeout too high, so instead I would like to write the expired records to local storage. The next step would be a producer that consumes records from that storage (whenever it is not empty) and sends them to the remote broker if/when it can. The records are critical, which is why I want to store them rather than drop them; the ordering does not matter. So how can I tell the producer not to discard expired records, but to write them somewhere before removing them from the batch?
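To make it concrete, this is roughly the fallback I have in mind (a rough sketch only; the file path and JSON-lines format are placeholders, and I bind each record to its errback with functools.partial since kafka-python only passes the exception to the errback):

    import json
    import os
    from functools import partial

    SPOOL_FILE = 'expired_records.jsonl'  # placeholder local storage

    def spool_on_error(topic, value, exc):
        # persist the failed record locally instead of losing it;
        # topic and value are bound via partial since the errback only gets exc
        with open(SPOOL_FILE, 'a') as f:
            f.write(json.dumps({'topic': topic, 'value': value.decode()}) + '\n')

    def send_with_spool(producer, topic, value):
        producer.send(topic, value) \
            .add_errback(partial(spool_on_error, topic, value))

    def replay_spooled(producer):
        # separate pass: re-send spooled records; order is not important,
        # and a record that fails again simply gets spooled again
        if not os.path.exists(SPOOL_FILE):
            return
        with open(SPOOL_FILE) as f:
            lines = f.readlines()
        os.remove(SPOOL_FILE)  # not crash-safe, just a sketch
        for line in lines:
            rec = json.loads(line)
            send_with_spool(producer, rec['topic'], rec['value'].encode())

Is something along these lines a reasonable direction, or is there a producer setting that handles this for me?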
Thanks for your help