I have a topic with two partitions. I'm using @RetryableTopic, and I can see log lines in my app console like INFO o.s.k.r.RetryTopicConfigurer - Received message in dlt listener: {topic name with second partition}. This looks wrong, because that is just the other partition, not the DLT topic. How can I hide or avoid those logs?
@RetryableTopic(
        attempts = "1",
        backoff = @Backoff(delay = 100, multiplier = 3.0),
        autoCreateTopics = "false",
        topicSuffixingStrategy = TopicSuffixingStrategy.SUFFIX_WITH_INDEX_VALUE,
        numPartitions = "2")
// Note: topics and topicPartitions are mutually exclusive on @KafkaListener,
// so only the explicit partition assignment is kept here.
@KafkaListener(id = "ccn2_listener", groupId = "test", autoStartup = "${listen.auto.start:true}",
        topicPartitions = @TopicPartition(topic = "ccn2-bam-raw-data", partitions = "1"))
public void listen(ConsumerRecord<String, String> consumerRecord, Acknowledgment acknowledgment)
        throws IOException, InterruptedException {
    log.info(consumerRecord.key());
    log.info(consumerRecord.value());
    // ... some work with the record ...
    if (/* some condition */) {
        throw new RodaTableMappingException("Problem with mapping Kafka record, sending it on dlt topic");
    }
    acknowledgment.acknowledge(); // required with AckMode.MANUAL_IMMEDIATE
}
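For context on where that log line comes from: when the listener class has no @DltHandler method, @RetryableTopic installs a default DLT handler that logs "Received message in dlt listener: ...". Providing your own handler replaces that default. A minimal sketch (the method name and body are my own illustration; it goes in the same class as the listener):

```java
// Placed in the same class as the @RetryableTopic listener above.
// With a @DltHandler present, records that exhaust their retries are routed
// here instead of to the default handler that emits the INFO log line.
@DltHandler
public void handleDlt(ConsumerRecord<String, String> consumerRecord) {
    // hypothetical handling; replace with real dead-letter logic
    log.warn("DLT record key={} value={}", consumerRecord.key(), consumerRecord.value());
}
```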
Application properties file:
elastic.apm.enabled=true
elastic.apm.server-url=url
elastic.apm.service-name=name
elastic.apm.secret-token=token
elastic.apm.environment=prod
elastic.apm.application-packages=package
elastic.apm.log-level=INFO
apminsight.console.logger=true
And I can see logs like this in my console:

2023-01-24 02:16:08,824 [topic_listener-dlt-0-C-1] INFO o.s.k.r.RetryTopicConfigurer - Received message in dlt listener: topic-1@38262

I'm running two instances of this app against the same topic and consumer group, each assigned a different partition (0 and 1). I understand these logs appear because the message was received on the other partition, but how do I avoid them?
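If the aim is simply to silence those lines, raising the log level for the retry-topic package should work. A sketch using the standard Spring Boot logging.level.* property, assuming o.s.k.r in the log output abbreviates org.springframework.kafka.retrytopic:

```properties
# Hide INFO output from the retryable-topic infrastructure (RetryTopicConfigurer etc.)
logging.level.org.springframework.kafka.retrytopic=WARN
```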
Consumer configuration:
@Bean
public ConsumerFactory<String, String> consumerFactory() {
    return new DefaultKafkaConsumerFactory<>(consumerConfigurations());
}

@Bean
public Map<String, Object> consumerConfigurations() {
    Map<String, Object> configurations = new HashMap<>();
    // Plain consumer properties belong under ConsumerConfig, not StreamsConfig;
    // the Kafka Streams serde entries are not used by a KafkaConsumer and were dropped.
    configurations.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBroker0);
    configurations.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    configurations.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    configurations.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    configurations.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    configurations.put("security.protocol", "SSL");
    configurations.put("ssl.truststore.location", truststoreLocation);
    configurations.put("ssl.truststore.password", truststorePassword);
    configurations.put("ssl.keystore.location", keystoreLocation);
    configurations.put("ssl.keystore.password", keyPassword);
    configurations.put("ssl.key.password", keyPassword);
    return configurations;
}
@Bean
ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
    return factory;
}
Producer configuration:
@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put("security.protocol", "SSL");
    props.put("ssl.truststore.location", truststoreLocation);
    props.put("ssl.truststore.password", truststorePassword);
    props.put("ssl.keystore.location", keystoreLocation);
    props.put("ssl.keystore.password", keyPassword);
    props.put("ssl.key.password", keyPassword);
    return props;
}
@Bean
public ProducerFactory<String, String> producerFactory() {
return new DefaultKafkaProducerFactory<>(producerConfigs());
}
@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
return new KafkaTemplate<>(producerFactory());
}