
Tracing information does not propagate over Kafka messages because the method SleuthKafkaAspect.wrapProducerFactory() is never triggered. On the producer side, the message is sent correctly and the tracing information is logged correctly. On the consumer side, however, a new traceId and spanId are created.

The following two log lines show different values for traceId and spanId (and parentId):

2021-03-23 11:42:30.158 [http-nio-9185-exec-2] INFO  my.company.Producer - /4afe07273872918b/4afe07273872918b// - Sending event='MyEvent'
2021-03-23 11:42:54.374 [org.springframework.kafka.KafkaListenerEndpointContainer#1-0-C-1] INFO my.company.Consumer /1fec3bf6a3c91773/ff4bd26b2e509ed8/1fec3bf6a3c91773/ - Received new event='MyEvent'

At first, using Kafdrop and also by debugging, I verified that the message headers don't contain any tracing information.
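
For reference, a minimal sketch of the kind of listener one can use to dump the record headers (topic and class names are hypothetical); with working instrumentation you would expect to see a b3 (or legacy X-B3-TraceId/X-B3-SpanId) entry here:

import java.nio.charset.StandardCharsets;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class HeaderDumpingListener {

    @KafkaListener(topics = "my-topic")
    public void listen(ConsumerRecord<String, Object> record) {
        // Print every header; a traced producer would have added the
        // "b3" (or legacy "X-B3-TraceId"/"X-B3-SpanId") entries.
        record.headers().forEach(header -> System.out.printf(
                "%s = %s%n", header.key(),
                new String(header.value(), StandardCharsets.UTF_8)));
    }
}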

After that, I figured out that the method SleuthKafkaAspect.wrapProducerFactory() is never triggered, while on the consumer side the method SleuthKafkaAspect.anyConsumerFactory() is.

The library versions used are the following:

  • spring boot: 2.3.7.RELEASE
  • spring cloud bom: Hoxton.SR10
  • spring cloud: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring kafka: 2.5.10.RELEASE
  • kafka client: 2.4.1
  • spring-cloud-starter-sleuth: 2.2.7.RELEASE
  • spring-cloud-sleuth-zipkin: 2.2.7.RELEASE

The kafka client version is 2.4.1 because of a downgrade from 2.5.1, which caused a production bug that increased CPU usage. I also tried the following library version combinations, with no success:

  • spring boot: 2.3.7.RELEASE
  • spring cloud bom: Hoxton.SR10 (and Hoxton.SR8)
  • spring cloud: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring kafka: 2.5.10.RELEASE
  • kafka client: 2.5.1
  • spring-cloud-starter-sleuth: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring-cloud-sleuth-zipkin: 2.2.7.RELEASE (and 2.2.5.RELEASE)

  • spring boot: 2.3.7.RELEASE
  • spring cloud bom: Hoxton.SR10 (and Hoxton.SR8)
  • spring cloud: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring kafka: 2.5.10.RELEASE
  • kafka client: 2.6.0
  • spring-cloud-starter-sleuth: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring-cloud-sleuth-zipkin: 2.2.7.RELEASE (and 2.2.5.RELEASE)

  • spring boot: 2.3.7.RELEASE
  • spring cloud bom: Hoxton.SR10 (and Hoxton.SR8)
  • spring cloud: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring kafka: 2.6.x
  • kafka client: 2.6.0
  • spring-cloud-starter-sleuth: 2.2.7.RELEASE (and 2.2.5.RELEASE)
  • spring-cloud-sleuth-zipkin: 2.2.7.RELEASE (and 2.2.5.RELEASE)

We migrated our project from Spring Boot 2.3.0.RELEASE to 2.3.7.RELEASE; before the migration, everything was working correctly. Below are the old library versions:

  • spring-boot: 2.3.0.RELEASE
  • spring-kafka: 2.5.0.RELEASE
  • kafka-clients: 2.4.1
  • spring-cloud: 2.2.5.RELEASE
  • spring-cloud-starter-sleuth: 2.2.5.RELEASE
  • spring-cloud-sleuth-zipkin: 2.2.5.RELEASE

We also switched the logging stack to log4j2 (before, it was slf4j with logback).

Below are the related libraries:

- org.springframework.boot:spring-boot-starter-log4j2:jar:2.3.7.RELEASE:compile
- org.slf4j:jul-to-slf4j:jar:1.7.30:compile
- io.projectreactor:reactor-test:jar:3.3.12.RELEASE:test
- io.projectreactor:reactor-core:jar:3.3.12.RELEASE:test
- org.reactivestreams:reactive-streams:jar:1.0.3:test

The properties configured are the following:

spring.sleuth.messaging.enabled=true
spring.kafka.consumer.auto-offset-reset=latest
spring.kafka.consumer.enable-auto-commit=false
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.client-id=myClientIdentifier
spring.kafka.consumer.group-id=MyConsumerGroup
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer

The configuration class for creating the ProducerFactory is the following:


@Configuration
@EnableTransactionManagement
public class KafkaProducerConfig {

    private final KafkaProperties kafkaProperties;

    @Autowired
    public KafkaProducerConfig(
            KafkaProperties kafkaProperties) {
        this.kafkaProperties = kafkaProperties;
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate() {
        KafkaTemplate<String, Object> kafkaTemplate = new KafkaTemplate<>(producerFactory());
        return kafkaTemplate;
    }


    private ProducerFactory<String, Object> producerFactory() {
        DefaultKafkaProducerFactory<String, Object> defaultKafkaProducerFactory =
                new DefaultKafkaProducerFactory<>(producerConfigs());
        //defaultKafkaProducerFactory.transactionCapable();
        //defaultKafkaProducerFactory.setTransactionIdPrefix("tx-");
        return defaultKafkaProducerFactory;
    }

    private Map<String, Object> producerConfigs() {

        Map<String, Object> configs = kafkaProperties.buildProducerProperties();
        configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return configs;
    }

}
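
While debugging, one way to see what the aspect can (or cannot) wrap is to list the ProducerFactory beans in the context; a minimal sketch (class name is illustrative):

import org.springframework.beans.factory.ListableBeanFactory;
import org.springframework.boot.CommandLineRunner;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.stereotype.Component;

@Component
public class ProducerFactoryBeanCheck implements CommandLineRunner {

    private final ListableBeanFactory beanFactory;

    public ProducerFactoryBeanCheck(ListableBeanFactory beanFactory) {
        this.beanFactory = beanFactory;
    }

    @Override
    public void run(String... args) {
        // With the configuration above this prints an empty map, because
        // producerFactory() is a private method rather than a bean, so
        // there is nothing for SleuthKafkaAspect to intercept.
        System.out.println(beanFactory.getBeansOfType(ProducerFactory.class));
    }
}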

My spring boot application class:


@Profile("DEV")
@SpringBootApplication(
        scanBasePackages = {"my.company"},
        exclude = {
                DataSourceAutoConfiguration.class,
                DataSourceTransactionManagerAutoConfiguration.class,
                HibernateJpaAutoConfiguration.class
        }
)
@EnableSwagger2
@EnableFeignClients(basePackages = {"my.company.common", "my.company.integration"})
@EnableTransactionManagement
@EnableMongoRepositories(basePackages = {
        "my.company.repository"})
@EnableMBeanExport(registration = RegistrationPolicy.IGNORE_EXISTING)
@ServletComponentScan
public class DevAppStartup extends SpringBootServletInitializer {

    public static void main(String[] args) {
        SpringApplication.run(DevAppStartup.class, args);
    }

}

Here you can find the output of the command "mvn dependency:tree": mvn_dependency_tree.txt


2 Answers


As the documentation suggests, you need to create a ProducerFactory bean if you want to use your own KafkaTemplate:

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, Object> producerFactory(KafkaProperties kafkaProperties) {
        return new DefaultKafkaProducerFactory<>(kafkaProperties.buildProducerProperties());
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate(ProducerFactory<String, Object> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}
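
With the factory exposed as a bean, Sleuth's SleuthKafkaAspect can wrap it, and a plain injection of the template is enough for the producer to add the tracing headers. A hypothetical usage sketch (class and topic names are illustrative):

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class MyEventProducer {

    private final KafkaTemplate<String, Object> kafkaTemplate;

    public MyEventProducer(KafkaTemplate<String, Object> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(Object event) {
        // The producer created by the Sleuth-decorated factory injects
        // the tracing headers before the record is sent to the broker.
        kafkaTemplate.send("my-topic", event);
    }
}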
Jonatan Ivanov

Based on the documentation from Spring Sleuth: https://docs.spring.io/spring-cloud-sleuth/docs/current-SNAPSHOT/reference/html/integrations.html#sleuth-kafka-integration

We decorate the Kafka clients (KafkaProducer and KafkaConsumer) to create a span for each event that is produced or consumed. You can disable this feature by setting the value of spring.sleuth.kafka.enabled to false.

You have to register the Producer or Consumer as beans in order for Sleuth’s auto-configuration to decorate them. When you then inject the beans, the expected type must be Producer or Consumer (and NOT e.g. KafkaProducer).
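
A minimal sketch of what that registration could look like (class and bean names are illustrative); note that the declared type is Producer, not KafkaProducer:

import java.util.Map;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class RawKafkaClientConfig {

    // Expose the raw client as a bean so Sleuth's auto-configuration can
    // decorate it; injection points must also use the Producer interface.
    @Bean
    public Producer<String, Object> rawProducer(KafkaProperties kafkaProperties) {
        Map<String, Object> props = kafkaProperties.buildProducerProperties();
        return new KafkaProducer<>(props);
    }
}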

sendon1982