I have configured 3 Kafka brokers running on different ports, and I am using Spring Cloud Stream Kafka with:
brokers: localhost:9092,localhost:9093,localhost:9094
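For reference, the producer side of the pipeline is essentially the sketch below; the binding name, topic name, and payload type are placeholders for my real code, and the assumed binder configuration is shown in the comments.

```java
// Assumed binder configuration (application.yml), matching the brokers above:
//   spring.cloud.stream.kafka.binder.brokers: localhost:9092,localhost:9093,localhost:9094
//   spring.cloud.stream.bindings.pipeline-out-0.destination: sensor-data   (placeholder topic)
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class PipelineApplication {
    public static void main(String[] args) {
        SpringApplication.run(PipelineApplication.class, args);
    }
}

@Component
class DataForwarder {

    private final StreamBridge streamBridge;

    DataForwarder(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    // Called for every record that arrives from the upstream source;
    // sends it to the output binding, which maps to the Kafka topic.
    public void forward(String payload) {
        streamBridge.send("pipeline-out-0", payload);
    }
}
```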
I am building a data pipeline that receives a continuous stream of data and stores it in a Kafka topic while the 3 brokers are running. So far there is no problem. My concern: suppose all 3 brokers go down for 5 minutes. During that time I cannot write to the Kafka topic, so there is data loss for those 5 minutes. From Spring Boot I get this warning:
2020-10-06 11:44:20.840 WARN 2906 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 0 (/192.168.1.78:9092) could not be established. Broker may not be available.
Is there a way to store the data temporarily while all brokers are down, and then resume writing it to the topic from that temporary storage once the brokers are back up?