
I have Kafka consumers in a Spring Boot application. I have set `ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG` to `false`, and my consumers manually acknowledge the messages. Spring Kafka: 2.2.11.RELEASE

My configuration:

@Override
public Map<String, Object> consumerConfig() {
    Map<String, Object> props = new HashMap<>();
    props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, securityProtocol);
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    // auto-commit disabled; offsets are committed via manual acknowledgment
    props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    props.put(ConsumerConfig.HEARTBEAT_INTERVAL_MS_CONFIG, heartbeatInterval);
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
    props.put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, maxPollIntervalMs);
    props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, sessionTimeout);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
    // delegate deserializers wrapped by ErrorHandlingDeserializer2
    props.put(ErrorHandlingDeserializer2.KEY_DESERIALIZER_CLASS, KafkaAvroDeserializer.class);
    props.put(ErrorHandlingDeserializer2.VALUE_DESERIALIZER_CLASS, KafkaAvroDeserializer.class);
    props.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, schemaRegistryServers);
    return props;
}

Container Factory:

ConcurrentKafkaListenerContainerFactory<K, V> kvConcurrentKafkaListenerContainerFactory =
    new ConcurrentKafkaListenerContainerFactory<>();
kvConcurrentKafkaListenerContainerFactory.setConsumerFactory(
    new DefaultKafkaConsumerFactory<>(props, getAvroKeyDeserializer(), getAvroValueDeserializer()));
kvConcurrentKafkaListenerContainerFactory.getContainerProperties().setAckOnError(false);
kvConcurrentKafkaListenerContainerFactory.getContainerProperties()
    .setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
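
For reference, here is a minimal sketch of how this fragment would typically be exposed as the bean that `containerFactory = CONTAINER_FACTORY` refers to (the bean method name and the `Key`/`Envelope` generic types are my assumptions, not taken from the snippet; `consumerConfig()` and the `getAvro*Deserializer()` helpers are from the question):

    @Bean(name = CONTAINER_FACTORY) // the constant referenced by the @KafkaListener below
    public ConcurrentKafkaListenerContainerFactory<Key, Envelope> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<Key, Envelope> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
        // consumer factory built from the consumerConfig() map shown above
        factory.setConsumerFactory(new DefaultKafkaConsumerFactory<>(
            consumerConfig(), getAvroKeyDeserializer(), getAvroValueDeserializer()));
        // offsets are committed only when the listener calls acknowledge()
        factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
        factory.getContainerProperties().setAckOnError(false);
        return factory;
    }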

Kafka Consumer:

@KafkaListener(topics = "${topic-name}", groupId = "${group-id}", containerFactory = CONTAINER_FACTORY)
public void consume(ConsumerRecord<Key, Envelope> record, Acknowledgment acknowledgment) {
    final Envelope envelope = record.value();
    if (/* some condition */) {
        // logic
    }
    acknowledgment.acknowledge();
}

The issue is that the offset is lost if the application crashes at the `if` statement.

My understanding is that if `acknowledgment.acknowledge()` is not called and the application crashes, then on restart the same message should be processed again.

I need help understanding what I am doing wrong here.

JDev
  • What do you mean by "application crashes"? If the JVM is shut down, the record will be redelivered. If you mean an exception is thrown and you are using a version older than 2.5.x (Boot 2.3), you can configure a `SeekToCurrentErrorHandler` so, if the listener throws an exception, the record will be redelivered. With 2.5, that's the default error handler. With earlier versions, the error would be logged and the consumer moves on. – Gary Russell Jun 03 '20 at 01:34
  • @GaryRussell Huge respect for you :) I am using spring-kafka 2.2.11 RELEASE. I am testing out things in local and for this scenario, I have put a debug point at If statement, and when control comes there I am forcefully shutdown the JVM. On the restart, I am expecting the same message as offset was not committed. – JDev Jun 03 '20 at 01:43
  • Given that scenario, you should see the record redelivered because the offset will not be committed until you call `acknowledge()`. If that's not what you are seeing, something else is going on. – Gary Russell Jun 03 '20 at 02:22
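
Following up on the first comment above: with Spring Kafka 2.2.x, a listener *exception* (as opposed to a JVM shutdown) is only logged and the consumer moves on, so the record is not redelivered unless an error handler that re-seeks the record is configured. A minimal sketch of that configuration on the factory from the question (same variable name reused; `SeekToCurrentErrorHandler` also has constructors for limiting the number of retries):

    // Re-seek the unacknowledged record so it is redelivered after a listener exception
    kvConcurrentKafkaListenerContainerFactory.setErrorHandler(new SeekToCurrentErrorHandler());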

0 Answers