I have two Spring Boot apps: one is a Kafka publisher and the other is a consumer. I am trying to write an integration test to make sure that events are sent and received.
The test is green when run from the IDE, or from the command line on its own, e.g. mvn test -Dtest=KafkaPublisherTest. However, when I build the whole project, the test fails with org.awaitility.core.ConditionTimeoutException.
. There are multiple @EmbeddedKafka tests in the project.
The test gets stuck after these lines in the logs:

```
2021-11-30 09:17:12.366 INFO 1437 --- [ntainer#0-0-C-1] o.s.k.l.KafkaMessageListenerContainer : wages-consumer-test: partitions assigned: [wages-test-0, wages-test-1]
2021-11-30 09:17:14.464 INFO 1437 --- [er-event-thread] kafka.controller.KafkaController : [Controller id=0] Processing automatic preferred replica leader election
```
If you have a better idea on how to test such things, please share.
Here is what the test looks like:
```java
@SpringBootTest(properties = { "kafka.wages-topic.bootstrap-address=${spring.embedded.kafka.brokers}" })
@EmbeddedKafka(partitions = 1, topics = "${kafka.wages-topic.name}")
class KafkaPublisherTest {

    @Autowired
    private TestWageProcessor testWageProcessor;

    @Autowired
    private KafkaPublisher kafkaPublisher;

    @Autowired
    private KafkaTemplate<String, WageEvent> kafkaTemplate;

    @Test
    void publish() {
        Date date = new Date();
        WageCreateDto wageCreateDto = new WageCreateDto()
                .setName("test")
                .setSurname("test")
                .setWage(BigDecimal.ONE)
                .setEventTime(date);

        kafkaPublisher.publish(wageCreateDto);
        kafkaTemplate.flush();

        WageEvent expected = new WageEvent()
                .setName("test")
                .setSurname("test")
                .setWage(BigDecimal.ONE)
                .setEventTimeMillis(date.toInstant().toEpochMilli());

        await()
                .atLeast(Duration.ONE_HUNDRED_MILLISECONDS)
                .atMost(Duration.TEN_SECONDS)
                .with()
                .pollInterval(Duration.ONE_HUNDRED_MILLISECONDS)
                .until(testWageProcessor::getLastReceivedWageEvent, equalTo(expected));
    }
}
```
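One detail worth noting about the assertion: until(..., equalTo(expected)) can only succeed if WageEvent has a value-based equals; otherwise Hamcrest falls back to reference equality and the condition never becomes true. The real class is in the linked repo; a minimal sketch of the shape it needs (field names assumed from the setters used in the test above):

```java
import java.math.BigDecimal;
import java.util.Objects;

// Sketch only: field names are assumed from the setters in the test above.
public class WageEvent {
    private String name;
    private String surname;
    private BigDecimal wage;
    private long eventTimeMillis;

    public WageEvent setName(String name) { this.name = name; return this; }
    public WageEvent setSurname(String surname) { this.surname = surname; return this; }
    public WageEvent setWage(BigDecimal wage) { this.wage = wage; return this; }
    public WageEvent setEventTimeMillis(long eventTimeMillis) { this.eventTimeMillis = eventTimeMillis; return this; }

    // Without a value-based equals, Hamcrest's equalTo compares references and
    // the Awaitility condition can never pass.
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof WageEvent)) return false;
        WageEvent that = (WageEvent) o;
        return eventTimeMillis == that.eventTimeMillis
                && Objects.equals(name, that.name)
                && Objects.equals(surname, that.surname)
                && Objects.equals(wage, that.wage);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, surname, wage, eventTimeMillis);
    }
}
```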
Publisher config:
```java
@Configuration
@EnableConfigurationProperties(WagesTopicPublisherProperties.class)
public class KafkaConfiguration {

    @Bean
    public KafkaAdmin kafkaAdmin(WagesTopicPublisherProperties wagesTopicPublisherProperties) {
        Map<String, Object> configs = new HashMap<>();
        configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, wagesTopicPublisherProperties.getBootstrapAddress());
        return new KafkaAdmin(configs);
    }

    @Bean
    public NewTopic wagesTopic(WagesTopicPublisherProperties wagesTopicPublisherProperties) {
        return new NewTopic(
                wagesTopicPublisherProperties.getName(),
                wagesTopicPublisherProperties.getPartitions(),
                wagesTopicPublisherProperties.getReplicationFactor());
    }

    @Primary
    @Bean
    public WageEventSerde wageEventSerde() {
        return new WageEventSerde();
    }

    @Bean
    public ProducerFactory<String, WageEvent> producerFactory(WagesTopicPublisherProperties wagesTopicPublisherProperties, WageEventSerde wageEventSerde) {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, wagesTopicPublisherProperties.getBootstrapAddress());
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, wageEventSerde.serializer().getClass());
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, WageEvent> kafkaTemplate(ProducerFactory<String, WageEvent> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}
```
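KafkaPublisher itself does nothing exotic: it maps the DTO to a WageEvent and sends it through the KafkaTemplate. A sketch of the mapping (class shapes are assumed from the test above; the real code is in the linked repo):

```java
import java.math.BigDecimal;
import java.util.Date;

// Sketch only: the DTO/event shapes are assumed from the test above; the real
// KafkaPublisher (in the linked repo) performs this mapping and then calls
// kafkaTemplate.send(topic, event).
public class WageMappingSketch {

    static class WageCreateDto {
        String name;
        String surname;
        BigDecimal wage;
        Date eventTime;
    }

    static class WageEvent {
        String name;
        String surname;
        BigDecimal wage;
        long eventTimeMillis;
    }

    static WageEvent toEvent(WageCreateDto dto) {
        WageEvent event = new WageEvent();
        event.name = dto.name;
        event.surname = dto.surname;
        event.wage = dto.wage;
        // java.util.Date -> epoch millis, matching the expected value built in the test
        event.eventTimeMillis = dto.eventTime.toInstant().toEpochMilli();
        return event;
    }
}
```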
Consumer config:
```java
@Configuration
@EnableConfigurationProperties(WagesTopicConsumerProperties.class)
public class ConsumerConfiguration {

    @ConditionalOnMissingBean(WageEventSerde.class)
    @Bean
    public WageEventSerde wageEventSerde() {
        return new WageEventSerde();
    }

    @Bean
    public ConsumerFactory<String, WageEvent> wageConsumerFactory(WagesTopicConsumerProperties wagesTopicConsumerProperties, WageEventSerde wageEventSerde) {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, wagesTopicConsumerProperties.getBootstrapAddress());
        props.put(ConsumerConfig.GROUP_ID_CONFIG, wagesTopicConsumerProperties.getGroupId());
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, wageEventSerde.deserializer().getClass());
        return new DefaultKafkaConsumerFactory<>(
                props,
                new StringDeserializer(),
                wageEventSerde.deserializer());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, WageEvent> wageEventConcurrentKafkaListenerContainerFactory(ConsumerFactory<String, WageEvent> wageConsumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, WageEvent> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(wageConsumerFactory);
        return factory;
    }
}
```
And the listener on the consumer side:

```java
@KafkaListener(
        topics = "${kafka.wages-topic.name}",
        containerFactory = "wageEventConcurrentKafkaListenerContainerFactory")
public void consumeWage(WageEvent wageEvent) {
    log.info("Wage event received: " + wageEvent);
    wageProcessor.process(wageEvent);
}
```
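TestWageProcessor is not shown above: it is a small helper bean in the test sources (the real one is in the linked repo) that just remembers the last event handed over by the listener so the test can poll it with Awaitility. Roughly:

```java
import java.util.concurrent.atomic.AtomicReference;

// Sketch of the test helper (the real one is in the linked repo and is
// registered as a Spring bean). The value type is WageEvent in the real code;
// Object is used here only to keep the sketch self-contained.
public class TestWageProcessor {

    // AtomicReference because the listener container thread writes
    // while the Awaitility poll thread reads.
    private final AtomicReference<Object> lastReceivedWageEvent = new AtomicReference<>();

    public void process(Object wageEvent) {
        lastReceivedWageEvent.set(wageEvent);
    }

    public Object getLastReceivedWageEvent() {
        return lastReceivedWageEvent.get();
    }
}
```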
Here is the project source code: https://github.com/aleksei17/springboot-rest-kafka-mysql
Here are the logs of the failed build: https://drive.google.com/file/d/1uE2w8rmJhJy35s4UJXf4_ON3hs9JR6Au/view?usp=sharing