
This is a follow-up to the question: Kafka transaction rollback not working with 3 topics for RecordTooLargeException.

Regarding EDIT 3 there, I have the following question:

How can I send the error to the DB and, at the same time, to the DLQ while using an AfterRollbackProcessor?

I added addNotRetryableExceptions(RecordTooLargeException, IllegalArgumentException, CustomBusinessException) to the DefaultAfterRollbackProcessor.

After the recovery phase (save the error to the DB and send to the DLQ), if a rebalance or restart happens, the code retries the failed record (RecordTooLargeException) again. How can I skip the not-retryable exceptions on subsequent attempts?

    @Bean
    AfterRollbackProcessor<Object, Object> arp() {
        DefaultAfterRollbackProcessor<Object, Object> darp = new DefaultAfterRollbackProcessor<>((rec, ex) -> {
            log.error("#### Failed to process {} from topic, partition {}-{}, @{}",
                    rec.value(), rec.topic(), rec.partition(), rec.offset(), ex);
            // If the exception is not retryable, how to tell Kafka not to redeliver this record after a restart?
        }, new FixedBackOff(3000L, 2));

        darp.addNotRetryableExceptions(RecordTooLargeException.class, IllegalArgumentException.class);

        return darp;
    }
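Conceptually, `addNotRetryableExceptions` boils down to keeping a set of exception types and, before scheduling a retry, checking whether the thrown exception is assignable to one of them. A minimal stdlib sketch of that idea (the class name and method are hypothetical, not Spring Kafka API):

```java
import java.util.Set;

// Hypothetical sketch of the not-retryable classification: the processor
// holds a set of exception types; a matching exception skips the backoff
// retries and goes straight to the recoverer.
public class NotRetryableCheck {

    private static final Set<Class<? extends Exception>> NOT_RETRYABLE = Set.of(
            IllegalArgumentException.class
            // RecordTooLargeException.class would be registered here too
    );

    // Returns false when the exception matches a registered not-retryable type,
    // i.e. the record should be recovered immediately instead of retried.
    public static boolean isRetryable(Exception ex) {
        return NOT_RETRYABLE.stream().noneMatch(c -> c.isInstance(ex));
    }

    public static void main(String[] args) {
        System.out.println(isRetryable(new RuntimeException("transient")));   // true
        System.out.println(isRetryable(new IllegalArgumentException("bad"))); // false
    }
}
```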

Following the suggestion in the answer, I updated the code as below:


    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Autowired
    private DBHandler dbHandler;

    @Bean
    AfterRollbackProcessor<Object, Object> arp() {
        DefaultAfterRollbackProcessor<Object, Object> darp = new DefaultAfterRollbackProcessor<>((rec, ex) -> {
            log.error("#### Failed to process {} from topic, partition {}-{}, @{}",
                    rec.value(), rec.topic(), rec.partition(), rec.offset(), ex);

            // Save the failed record to the DB
            dbHandler.handleFailure((String) rec.key(), (String) rec.value(), ex, rec.topic());

            // Want to send the record to the DLQ as well - how to do that?

        }, new FixedBackOff(3000L, 3), kafkaTemplate, true);

        darp.addNotRetryableExceptions(RecordTooLargeException.class, IllegalArgumentException.class);
        return darp;
    }
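For reference, `FixedBackOff(3000L, 3)` means (as I understand the contract) that `nextBackOff()` hands out the 3-second interval up to 3 times and then returns STOP, at which point the recoverer runs instead of another redelivery. A plain-Java sketch of that loop, with no Spring classes (the class name is hypothetical):

```java
// Plain-Java sketch of the FixedBackOff(interval, maxAttempts) contract:
// nextBackOff() returns the interval until maxAttempts retries have been
// handed out, then returns STOP (-1), triggering recovery.
public class FixedBackOffSketch {

    public static final long STOP = -1L;

    private final long interval;
    private final long maxAttempts;
    private long attempts = 0;

    public FixedBackOffSketch(long interval, long maxAttempts) {
        this.interval = interval;
        this.maxAttempts = maxAttempts;
    }

    public long nextBackOff() {
        return attempts++ < maxAttempts ? interval : STOP;
    }

    public static void main(String[] args) {
        FixedBackOffSketch backOff = new FixedBackOffSketch(3000L, 3);
        int retries = 0;
        while (backOff.nextBackOff() != STOP) {
            retries++; // each iteration = one redelivery of the failed record
        }
        System.out.println(retries); // 3 retries, then the recoverer runs
    }
}
```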

I was eventually able to find a solution.

I created the class below to dump the error record to the DB during the recovery phase:

@Slf4j
@Service
public class DBPublishingRecordRecoverer implements ConsumerRecordRecoverer {

    @Override
    public void accept(ConsumerRecord<?, ?> rec, Exception ex) {
        log.error("@ DB Operation | process {} from topic, partition {}-{}, @{} : {}",
                rec.value(), rec.topic(), rec.partition(), rec.offset(), ex.getMessage());
        // The actual DB write (e.g. dbHandler.handleFailure(...)) goes here
    }
}

Then I created a class that sends the same failed record to the DLT:


@Slf4j
@Service
public class DLTRecordRecoverer {

    public DeadLetterPublishingRecoverer dlr(@Nullable KafkaOperations<?, ?> kafkaOperations) {
        return new DeadLetterPublishingRecoverer(kafkaOperations) {

            @Override
            public void accept(ConsumerRecord<?, ?> record, Exception exception) {
                log.info("DLQ to process {} from topic, partition {}-{}, @{} : {}",
                        record.value(), record.topic(), record.partition(), record.offset(), exception.getMessage());
                super.accept(record, exception);
            }
        };
    }
}

Now wire these two recoverers into the AfterRollbackProcessor:


    @Bean
    AfterRollbackProcessor<Object, Object> xyz() {
        DefaultAfterRollbackProcessor<Object, Object> darp = new DefaultAfterRollbackProcessor<>(
                testRecoverer.andThen(dltRecordRecoverer.dlr(kafkaTemplate)),
                new FixedBackOff(3000L, 3), kafkaTemplate, true);

        darp.addNotRetryableExceptions(RecordTooLargeException.class, IllegalArgumentException.class);
        return darp;
    }
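The `andThen` composition works because `ConsumerRecordRecoverer` extends `java.util.function.BiConsumer<ConsumerRecord<?, ?>, Exception>`, so chaining is plain `BiConsumer.andThen`: the DB recoverer runs first, then the DLT recoverer, for the same record. A minimal stdlib sketch of the ordering (the names mirror the beans above but are stand-ins, using `String` in place of `ConsumerRecord`):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiConsumer;

// Demonstrates the ordering guarantee of BiConsumer.andThen: the first
// consumer (DB dump) completes before the second (DLT publish) runs.
public class RecovererChain {

    public static final List<String> calls = new ArrayList<>();

    public static void main(String[] args) {
        BiConsumer<String, Exception> dbRecoverer  = (rec, ex) -> calls.add("db:" + rec);
        BiConsumer<String, Exception> dltRecoverer = (rec, ex) -> calls.add("dlt:" + rec);

        // Same shape as testRecoverer.andThen(dltRecordRecoverer.dlr(kafkaTemplate))
        BiConsumer<String, Exception> combined = dbRecoverer.andThen(dltRecoverer);
        combined.accept("rec-1", new RuntimeException("too large"));

        System.out.println(calls); // [db:rec-1, dlt:rec-1]
    }
}
```

Note that if the first recoverer throws, `andThen` never invokes the second, so a DB failure would also skip the DLT publish.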

The log output:

c.t.t.demo.DBPublishingRecordRecoverer   : @ DB Operation |  process Another example from topic, partition TEST-TOPIC-2, @20
c.t.transaction.demo.DLTRecordRecoverer  : DLQ to process Another example from topic, partition TEST-TOPIC-2, @20
o.a.k.c.p.internals.TransactionManager   : [Producer clientId=raw-item-producer-client-1, transactionalId=tx-01d1a934-3c0e-45b4-ac1f-5b8fa

On the consumer side:

KafkaMessageListenerContainer    : aem-dam-edm-group-id: partitions assigned: [PRICE-TOPIC-0, PRICE-TOPIC-1, PRICE-TOPIC-2]
KafkaMessageListenerContainer    : aem-dam-edm-group-id: partitions assigned: [ITEM-TOPIC-1, ITEM-TOPIC-2, ITEM-TOPIC-0]
KafkaMessageListenerContainer    : aem-dam-edm-group-id: partitions assigned: [INVENTORY-TOPIC-1, INVENTORY-TOPIC-0, INVENTORY-TOPIC-2]
KafkaMessageListenerContainer    : aem-dam-edm-group-id: partitions assigned: [TEST-TOPIC.DLT-1, TEST-TOPIC.DLT-0, TEST-TOPIC.DLT-2]
ransaction.demo.ConsumerService  : Received payload. Topic : TEST-TOPIC.DLT , key :TestKey-002 , value : Another example
user3575226

1 Answer


In order to commit the offset of the recovered record, you have to pass a transactional KafkaTemplate into the DefaultAfterRollbackProcessor and set commitRecovered to true. See the javadocs:

/**
 * Construct an instance with the provided recoverer which will be called after the
 * backOff returns STOP for a topic/partition/offset.
 * @param recoverer the recoverer; if null, the default (logging) recoverer is used.
 * @param backOff the {@link BackOff}.
 * @param kafkaOperations for sending the recovered offset to the transaction.
 * @param commitRecovered true to commit the recovered record's offset; requires a
 * {@link KafkaOperations}.
 * @since 2.5.3
 */
Gary Russell
  • I passed the KafkaTemplate and commitRecovered = true to the DefaultAfterRollbackProcessor as suggested. But how to send it to the DLQ? Do I need to write a custom utility to send it to the DLQ, or is there a smarter way along with the DB dump? – user3575226 Mar 28 '22 at 15:23
  • Create a `ConsumerRecordRecoverer` that calls your dbHandler and also calls a `DeadLetterPublishingRecoverer`. https://docs.spring.io/spring-kafka/docs/current/reference/html/#dead-letters – Gary Russell Mar 28 '22 at 16:04
  • Any sample code / POC link for ConsumerRecordRecoverer would be extremely helpful. – user3575226 Mar 28 '22 at 16:15
  • I updated the solution in the question itself. Can you please verify that it follows best practice? – user3575226 Mar 28 '22 at 17:49
  • It looks ok to me. – Gary Russell Mar 28 '22 at 20:09