
I have two Spring Boot apps: one is a Kafka publisher and the other is a consumer. I am trying to write an integration test to make sure that events are sent and received.

The test is green when run from the IDE or from the command line in isolation, e.g. mvn test -Dtest=KafkaPublisherTest. However, when I build the whole project, the test fails with org.awaitility.core.ConditionTimeoutException. There are multiple @EmbeddedKafka tests in the project.

The test gets stuck after these lines in logs:

2021-11-30 09:17:12.366  INFO 1437 --- [ntainer#0-0-C-1] o.s.k.l.KafkaMessageListenerContainer    : wages-consumer-test: partitions assigned: [wages-test-0, wages-test-1]
2021-11-30 09:17:14.464  INFO 1437 --- [er-event-thread] kafka.controller.KafkaController         : [Controller id=0] Processing automatic preferred replica leader election

If you have a better idea on how to test such things, please share.

Here is what the test looks like:

@SpringBootTest(properties = { "kafka.wages-topic.bootstrap-address=${spring.embedded.kafka.brokers}" })
@EmbeddedKafka(partitions = 1, topics = "${kafka.wages-topic.name}")
class KafkaPublisherTest {

    @Autowired
    private TestWageProcessor testWageProcessor;

    @Autowired
    private KafkaPublisher kafkaPublisher;

    @Autowired
    private KafkaTemplate<String, WageEvent> kafkaTemplate;

    @Test
    void publish() {
        Date date = new Date();
        WageCreateDto wageCreateDto = new WageCreateDto().setName("test").setSurname("test").setWage(BigDecimal.ONE).setEventTime(date);
        kafkaPublisher.publish(wageCreateDto);

        kafkaTemplate.flush();
        WageEvent expected = new WageEvent().setName("test").setSurname("test").setWage(BigDecimal.ONE).setEventTimeMillis(date.toInstant().toEpochMilli());

        await()
                .atLeast(Duration.ONE_HUNDRED_MILLISECONDS)
                .atMost(Duration.TEN_SECONDS)
                .with()
                .pollInterval(Duration.ONE_HUNDRED_MILLISECONDS)
                .until(testWageProcessor::getLastReceivedWageEvent, equalTo(expected));
    }
}
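
For context, TestWageProcessor is the test-scoped bean whose getLastReceivedWageEvent() the Awaitility condition polls; the real class lives in the linked repository. A minimal sketch of what the test relies on could look like this (the @TestComponent annotation and the WageProcessor interface are assumptions; the listener shown further down delegates to such a processor):

import org.springframework.boot.test.context.TestComponent;

@TestComponent
public class TestWageProcessor implements WageProcessor {

    // volatile so the Awaitility polling thread sees the value written by the listener thread
    private volatile WageEvent lastReceivedWageEvent;

    @Override
    public void process(WageEvent wageEvent) {
        this.lastReceivedWageEvent = wageEvent;
    }

    public WageEvent getLastReceivedWageEvent() {
        return lastReceivedWageEvent;
    }
}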

Publisher config:

@Configuration
@EnableConfigurationProperties(WagesTopicPublisherProperties.class)
public class KafkaConfiguration {

    @Bean
    public KafkaAdmin kafkaAdmin(WagesTopicPublisherProperties wagesTopicPublisherProperties) {
        Map<String, Object> configs = new HashMap<>();
        configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, wagesTopicPublisherProperties.getBootstrapAddress());
        return new KafkaAdmin(configs);
    }

    @Bean
    public NewTopic wagesTopic(WagesTopicPublisherProperties wagesTopicPublisherProperties) {
        return new NewTopic(wagesTopicPublisherProperties.getName(), wagesTopicPublisherProperties.getPartitions(), wagesTopicPublisherProperties.getReplicationFactor());
    }

    @Primary
    @Bean
    public WageEventSerde wageEventSerde() {
        return new WageEventSerde();
    }

    @Bean
    public ProducerFactory<String, WageEvent> producerFactory(WagesTopicPublisherProperties wagesTopicPublisherProperties, WageEventSerde wageEventSerde) {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(
                ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,
                wagesTopicPublisherProperties.getBootstrapAddress());
        configProps.put(
                ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                StringSerializer.class);
        configProps.put(
                ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                wageEventSerde.serializer().getClass());
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, WageEvent> kafkaTemplate(ProducerFactory<String, WageEvent> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}
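
The ${kafka.wages-topic.name} placeholder in the annotations above and the kafka.wages-topic.bootstrap-address override in the test suggest a properties class bound to the kafka.wages-topic prefix. A sketch of the assumed shape (field names inferred from the getters used in this configuration; the defaults are assumptions):

@ConfigurationProperties("kafka.wages-topic")
public class WagesTopicPublisherProperties {

    private String name;
    private String bootstrapAddress;
    private int partitions = 1;
    private short replicationFactor = 1;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public String getBootstrapAddress() { return bootstrapAddress; }
    public void setBootstrapAddress(String bootstrapAddress) { this.bootstrapAddress = bootstrapAddress; }

    public int getPartitions() { return partitions; }
    public void setPartitions(int partitions) { this.partitions = partitions; }

    public short getReplicationFactor() { return replicationFactor; }
    public void setReplicationFactor(short replicationFactor) { this.replicationFactor = replicationFactor; }
}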

Consumer config:

@Configuration
@EnableConfigurationProperties(WagesTopicConsumerProperties.class)
public class ConsumerConfiguration {

    @ConditionalOnMissingBean(WageEventSerde.class)
    @Bean
    public WageEventSerde wageEventSerde() {
        return new WageEventSerde();
    }

    @Bean
    public ConsumerFactory<String, WageEvent> wageConsumerFactory(WagesTopicConsumerProperties wagesTopicConsumerProperties, WageEventSerde wageEventSerde) {
        Map<String, Object> props = new HashMap<>();
        props.put(
                ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,
                wagesTopicConsumerProperties.getBootstrapAddress());
        props.put(
                ConsumerConfig.GROUP_ID_CONFIG,
                wagesTopicConsumerProperties.getGroupId());
        props.put(
                ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                StringDeserializer.class);
        props.put(
                ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                wageEventSerde.deserializer().getClass());
        return new DefaultKafkaConsumerFactory<>(
                props,
                new StringDeserializer(),
                wageEventSerde.deserializer());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, WageEvent> wageEventConcurrentKafkaListenerContainerFactory(ConsumerFactory<String, WageEvent> wageConsumerFactory) {

        ConcurrentKafkaListenerContainerFactory<String, WageEvent> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(wageConsumerFactory);
        return factory;
    }
}

The listener:

    @KafkaListener(
            topics = "${kafka.wages-topic.name}",
            containerFactory = "wageEventConcurrentKafkaListenerContainerFactory")
    public void consumeWage(WageEvent wageEvent) {
        log.info("Wage event received: " + wageEvent);
        wageProcessor.process(wageEvent);
    }

Here is the project source code: https://github.com/aleksei17/springboot-rest-kafka-mysql

Here are the logs of the failed build: https://drive.google.com/file/d/1uE2w8rmJhJy35s4UJXf4_ON3hs9JR6Au/view?usp=sharing

    It looks like you are starting multiple Boot applications and multiple brokers; consider using a single broker instance for all tests - https://docs.spring.io/spring-kafka/docs/current/reference/html/#using-the-same-brokers-for-multiple-test-classes – Gary Russell Nov 29 '21 at 16:41
  • @GaryRussell, I did not know about that option, thank you! It did not work in my case though; here is the code: https://github.com/aleksei17/springboot-rest-kafka-mysql/commit/d76952f3349ada827ad1a19d10bdf0006078f79f So I will check whether using Spring Cloud Stream would make the code more testable – Aleksei Nov 30 '21 at 07:22
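
For reference, the section of the spring-kafka documentation linked in the first comment describes roughly the following pattern for sharing one embedded broker across all test classes (the property name below is adapted to this project's kafka.wages-topic.bootstrap-address, which is an assumption; the documentation uses spring.kafka.bootstrap-servers):

import org.springframework.kafka.KafkaException;
import org.springframework.kafka.test.EmbeddedKafkaBroker;

public final class EmbeddedKafkaHolder {

    private static final EmbeddedKafkaBroker embeddedKafka = new EmbeddedKafkaBroker(1, false)
            .brokerListProperty("kafka.wages-topic.bootstrap-address");

    private static boolean started;

    private EmbeddedKafkaHolder() {
    }

    public static EmbeddedKafkaBroker getEmbeddedKafka() {
        if (!started) {
            try {
                // Start the broker only once; every test class reuses the same instance.
                embeddedKafka.afterPropertiesSet();
            }
            catch (Exception e) {
                throw new KafkaException("Embedded broker failed to start", e);
            }
            started = true;
        }
        return embeddedKafka;
    }
}

Test classes then call EmbeddedKafkaHolder.getEmbeddedKafka() (for example in a @BeforeAll method) instead of each declaring @EmbeddedKafka, so only one broker is started for the whole build.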

1 Answer


When I used Testcontainers Kafka instead of @EmbeddedKafka, the issue was solved. The tests looked like this:

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class PublisherApplicationTest {

    public static final KafkaContainer kafka =
            new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka").withTag("5.4.3"));

    static {
        kafka.start();
        System.setProperty("kafka.wages-topic.bootstrap-address", kafka.getBootstrapServers());
    }
}

However, I cannot say I understand the issue. When I used the singleton container pattern as described here, I had the same problem. Maybe something like @DirtiesContext could help: it fixed one test at work, but not in this learning project.
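
A variant of the same setup that avoids the static initializer, assuming Spring Boot 2.2.6+ (for @DynamicPropertySource) and the Testcontainers JUnit 5 extension, could look roughly like this:

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@Testcontainers
class PublisherApplicationTest {

    // Started once for the test class by the Testcontainers JUnit extension.
    @Container
    static final KafkaContainer kafka =
            new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka").withTag("5.4.3"));

    // Exposes the container's broker address to the Spring environment;
    // the supplier is resolved only when the property is actually read.
    @DynamicPropertySource
    static void kafkaProperties(DynamicPropertyRegistry registry) {
        registry.add("kafka.wages-topic.bootstrap-address", kafka::getBootstrapServers);
    }
}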
