We are trying to make our Java KafkaProducer resilient: if Kafka is unreachable for some reason, we store the failed messages in files and retry once Kafka comes back up. To check periodically whether Kafka is available, we create a new KafkaProducer object every minute, and we have observed a "Too many open files" issue: sockets are left open even though we call the producer's close() method. Since this process runs as a daemon, how can we release the sockets held by the producer so that we don't run into this issue?
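One way to avoid accumulating sockets from the periodic check is to not build a KafkaProducer (with its background I/O thread and Selector) just to test reachability, and instead probe the broker port with a plain TCP connect. This is a minimal sketch, not taken from the question's code; the class name, host, port, and timeout values are assumptions for illustration:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class BrokerProbe {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    // Unlike constructing a KafkaProducer, this opens exactly one socket and
    // the try-with-resources block guarantees it is closed either way.
    public static boolean isBrokerReachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false; // broker not reachable; nothing is left open
        }
    }

    public static void main(String[] args) {
        // Hypothetical broker address for illustration only
        System.out.println(isBrokerReachable("localhost", 9092, 2000));
    }
}
```

Note this only verifies that the port accepts connections, not that the broker is fully healthy, but it is usually enough to decide whether a resend attempt is worth making.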
We get the log message
[org.apache.kafka.common.utils.KafkaThread] (kafka-admin-client-thread | consumer-id7035)
Uncaught exception in thread 'kafka-admin-client-thread | consumer-id7035'
each time a KafkaProducer object is created. I need help figuring out how to close it cleanly, given that we keep creating new KafkaProducer objects to attempt resending messages while Kafka is down.
I tried using AdminClient to check whether Kafka is available, but even that caused the same issue.
If the producer is not able to connect to Kafka, it should come out cleanly without holding on to sockets or any other resources it may have acquired.
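For the send path itself, one pattern that helps bound resource usage is to close the producer in a finally block with an explicit timeout, and to cap how long the client blocks when brokers are down, so every failed attempt releases its sockets and network thread before the next retry. This is a rough sketch, assuming kafka-clients 2.x on the classpath; the bootstrap address, topic, and timeout values are illustrative, not from the question:

```java
import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ResilientSender {
    public static boolean trySend(String bootstrap, String topic, String value) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrap);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Fail fast instead of blocking indefinitely when brokers are down.
        props.put("max.block.ms", "5000");
        props.put("request.timeout.ms", "5000");
        props.put("delivery.timeout.ms", "10000");

        KafkaProducer<String, String> producer = null;
        try {
            producer = new KafkaProducer<>(props);
            producer.send(new ProducerRecord<>(topic, value)).get();
            return true;
        } catch (Exception e) {
            return false; // message stays in the file store for the next retry
        } finally {
            if (producer != null) {
                // close(Duration) bounds shutdown; this is what releases the
                // producer's sockets and kafka-producer-network-thread.
                producer.close(Duration.ofSeconds(5));
            }
        }
    }
}
```

Even with bounded close calls, it is generally cheaper to keep one long-lived producer for the lifetime of the daemon and gate retries on a lightweight reachability check, rather than constructing a fresh KafkaProducer every minute.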