I have an application that uses Kafka to synchronize data between instances, so it both produces and consumes data from Kafka. Additionally, the application consumes one Kafka topic, transforms the data, and streams it into another topic for clients to consume.
My application has two clusters for failover. Going through the Spring Kafka documentation I found this section, https://docs.spring.io/spring-kafka/docs/current/reference/html/#connecting, which describes ABSwitchCluster.
How can I use ABSwitchCluster to fail over automatically if the Kafka cluster goes down, for both KafkaTemplate.send() and @KafkaListener-annotated methods?
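For context, this is roughly how I understood the wiring from that documentation section; the broker addresses here are placeholders:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.core.ABSwitchCluster;

public class KafkaClusterConfig {

    @Bean
    public ABSwitchCluster kafkaSwitchCluster() {
        // Comma-delimited bootstrap-server lists for the primary and
        // secondary clusters (placeholder host names).
        return new ABSwitchCluster(
                "primary1:9092,primary2:9092",
                "secondary1:9092,secondary2:9092");
    }
}
```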
Update with more info
I've added error handlers for KafkaTemplate.send and for the Kafka consumer events NonResponsiveConsumerEvent and ListenerContainerIdleEvent. Ultimately they call a shared method to switch clusters, and a BeanPostProcessor is used to actually add the ABSwitchCluster to the KafkaResourceFactory beans.
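Simplified, the post-processor does something like this (my real class has more to it, but this is the relevant part):

```java
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.kafka.core.ABSwitchCluster;
import org.springframework.kafka.core.KafkaResourceFactory;

public class KafkaPostProcessor implements BeanPostProcessor {

    private final ABSwitchCluster switchCluster;

    public KafkaPostProcessor(ABSwitchCluster switchCluster) {
        this.switchCluster = switchCluster;
    }

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) {
        // Every producer factory, consumer factory, and KafkaAdmin extends
        // KafkaResourceFactory, so point them all at the switch; ABSwitchCluster
        // is a Supplier<String> that returns the active cluster's servers.
        if (bean instanceof KafkaResourceFactory) {
            ((KafkaResourceFactory) bean).setBootstrapServersSupplier(switchCluster);
        }
        return bean;
    }
}
```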
The switch-over code looks like this:
@Autowired
KafkaSwitchCluster kafkaSwitchCluster;

@Autowired
WebApplicationContext context;

@Autowired
KafkaListenerEndpointRegistry registry;

/**
 * Unable to use {@link Autowired} due to a circular dependency
 * with {@link KafkaPostProcessor}.
 */
public DefaultKafkaProducerFactory getDefaultKafkaProducerFactory() {
    return context.getBean(DefaultKafkaProducerFactory.class);
}

/** Back-end method to actually switch between the clusters. */
private void switchCluster() {
    if (kafkaSwitchCluster.isPrimary()) {
        kafkaSwitchCluster.secondary();
    }
    else {
        kafkaSwitchCluster.primary();
    }
    getDefaultKafkaProducerFactory().reset();
    registry.stop();
    registry.destroy();
    registry.start();
    for (MessageListenerContainer listener : registry.getListenerContainers()) {
        listener.stop();
        listener.start();
    }
}
Given the updates above, looking at the test logs it appears that the producer is correctly switching clusters, but my consumers are not. So how can I get the @KafkaListener consumers to switch?