
I have a @KafkaListener method that is supposed to get all messages in a topic, but I only get one message each time the @Scheduled method runs. How can I get all messages from the topic at once?

Here's my class:

@Slf4j
@Service
public class KafkaConsumerServiceImpl implements KafkaConsumerService {

    @Autowired
    private SimpMessagingTemplate webSocket;

    @Autowired
    private KafkaListenerEndpointRegistry registry;

    @Autowired
    private BrokerProducerService brokerProducerService;

    @Autowired
    private GlobalConfig globalConfig;

    @Override
    @KafkaListener(id = "snapshotOfOutagesId", topics = Constants.KAFKA_TOPIC, groupId = "snapshotOfOutages", autoStartup = "false")
    public void consumeToSnapshot(ConsumerRecord<String, OutageDTO> cr, @Payload String content) {
        log.info("Received content from Kafka notification to notification-snapshot topic: {}", content);
        MessageListenerContainer listenerContainer = registry.getListenerContainer("snapshotOfOutagesId");
        JSONObject jsonObject = new JSONObject(content);
        Map<String, Object> outageMap = jsonToMap(jsonObject);
        brokerProducerService.sendMessage(globalConfig.getTopicProperties().getSnapshotTopicName(),
                outageMap.get("outageId").toString(), toJson(outageMap));
        listenerContainer.stop();
    }

    @Scheduled(initialDelayString = "${scheduler.kafka.snapshot.monitoring}", fixedRateString = "${scheduler.kafka.snapshot.monitoring}")
    private void consumeWithScheduler() {
        MessageListenerContainer listenerContainer = registry.getListenerContainer("snapshotOfOutagesId");
        if (listenerContainer != null){
            listenerContainer.start();
        }
    }
}

And here are my Kafka properties in application.yml:

kafka:
  streams:
    common:
      configs:
        "[bootstrap.servers]": 192.168.99.100:9092
        "[client.id]": event
        "[producer.id]": event-producer
        "[max.poll.interval.ms]": 300000
        "[group.max.session.timeout.ms]": 300000
        "[session.timeout.ms]": 200000
        "[auto.commit.interval.ms]": 1000
        "[auto.offset.reset]": latest
        "[group.id]": event-consumer-group
        "[max.poll.records]": 1

And also my KafkaConfiguration class:

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> props = new HashMap<>(globalConfig.getBrokerProperties().getConfigs());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
        return props;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs(), new StringDeserializer(), new StringDeserializer());
    }

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }

1 Answer


What you're currently doing is:

  1. Create a listener but don't start it yet (autoStartup = false)
  2. When the scheduled job kicks in, start the container (it begins consuming from the topic)
  3. When the first message is consumed, stop the container (so no further messages are consumed)

So indeed the behavior you are describing is not a surprise.

@KafkaListener doesn't need a scheduled task to start consuming messages. I think you can remove autoStartup = false and remove the scheduled job, after which the listener will consume all messages on the topic one by one, and wait for new ones as they appear on the topic.
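For illustration, a minimal sketch of that simplification, reusing the names from your question (an assumption about the rest of your setup, not a drop-in replacement):

@KafkaListener(id = "snapshotOfOutagesId", topics = Constants.KAFKA_TOPIC, groupId = "snapshotOfOutages")
public void consumeToSnapshot(ConsumerRecord<String, String> cr) {
    // No registry lookup, no stop(): the container starts with the application
    // and keeps polling for new records.
    log.info("Received content from Kafka: {}", cr.value());
    Map<String, Object> outageMap = jsonToMap(new JSONObject(cr.value()));
    brokerProducerService.sendMessage(globalConfig.getTopicProperties().getSnapshotTopicName(),
            outageMap.get("outageId").toString(), toJson(outageMap));
}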

Also, some other things I noticed:

The properties you have are for Kafka Streams; for regular Spring Kafka you need the properties like so:

spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      auto-offset-reset: earliest
      ...etc

Also: why use @Payload String content instead of the already deserialized cr.value()?
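For example (a sketch, assuming a String value deserializer as configured in your consumerFactory):

@KafkaListener(id = "snapshotOfOutagesId", topics = Constants.KAFKA_TOPIC, groupId = "snapshotOfOutages")
public void consumeToSnapshot(ConsumerRecord<String, String> cr) {
    String content = cr.value(); // the message payload, no @Payload parameter needed
    log.info("Received: {}", content);
}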

  • Thanks for your detailed answer @moffeltje. How can I make the KafkaListener run once every 15 minutes if I delete the @Scheduled annotation? And what should I do to get all messages at once rather than one by one? By the way, I had already removed String content before your message, since I can get the value with cr.value() as you mentioned. – Burak Aug 26 '21 at 11:35
  • Why would you want to read all messages every 15 minutes? It doesn't sound like a proper event streaming architecture to me. You can take a look at the batch consumer in the Spring Kafka documentation, but I'm not sure it can do exactly what you want; see the sketch below. – moffeltje Aug 27 '21 at 09:35
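A minimal sketch of that batch-consumer idea, assuming Spring Kafka's batch listener support and the String consumerFactory from the question (the bean and method names here are illustrative):

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> batchListenerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setBatchListener(true); // deliver the whole polled batch instead of single records
    return factory;
}

@KafkaListener(topics = Constants.KAFKA_TOPIC, groupId = "snapshotOfOutages", containerFactory = "batchListenerFactory")
public void consumeBatch(List<String> messages) {
    // One invocation per poll; messages contains everything fetched in that poll.
    log.info("Received {} messages in one batch", messages.size());
}

Note that the "[max.poll.records]": 1 setting from the question would cap each batch at a single record, so it would need raising for batching to have any effect.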