
I'm using spring-boot 2.7.4 and spring-cloud-dependencies 2021.0.4.

I haven't found anything in the Spring documentation about adding trusted types to the BatchMessagingMessageConverter. I'm using Kafka to read messages in batch mode. If I insert a custom header (an instance of my own class), when the consumer reads the header it gets a DefaultKafkaHeaderMapper$NonTrustedHeaderType instead of my class.

I have this key in my configuration to activate batch mode:

spring.cloud.stream.bindings.nameBind-in-0.consumer.batch-mode=true

While debugging, I tried adding my class's package to the headerMapper in BatchMessagingMessageConverter, and everything worked fine. Is there a way to specify my package in the configuration?

Following the documentation at https://docs.spring.io/spring-cloud-stream/docs/3.2.5/reference/html/spring-cloud-stream-binder-kafka.html#kafka-binder-properties, I created a bean like this:

@Bean("kafkaHeaderMapperCustom")
KafkaHeaderMapper getKafkaHeaderMapperCustom() {
    var defKHM = new DefaultKafkaHeaderMapper();
    defKHM.addTrustedPackages("*");
    return defKHM;
}

I specified it under the key spring.cloud.stream.kafka.binder.headerMapperBeanName in my configuration, but it doesn't work. I suppose that configuration only applies in the non-batch context?

I also tried these properties:

spring.kafka.consumer.properties.spring.json.trusted.packages
spring.json.trusted.packages
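
As far as I can tell (this is my assumption from the spring-kafka documentation, not something stated for this exact case), spring.json.trusted.packages configures the JsonDeserializer used for the record payload, not the DefaultKafkaHeaderMapper used for headers, so a setting like the following would not affect the header problem:

spring.kafka.consumer.properties.spring.json.trusted.packages=com.example.kafka.header.types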

EDIT - Adding an example:

import org.springframework.boot.ApplicationRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.support.DefaultKafkaHeaderMapper;
import org.springframework.kafka.support.KafkaHeaderMapper;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.MessageHeaders;
import org.springframework.messaging.support.MessageBuilder;

import java.time.LocalDate;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

@SpringBootApplication
public class Application {

    public static final String HEADER_KEY = "CUSTOM_HEADER_KEY";

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    @Bean
    public ApplicationRunner runner(StreamBridge streamBridge) {
        return args -> {
            var headers = new MessageHeaders(Map.of(HEADER_KEY, new CustomHeaderClass("field1Value", LocalDate.now())));
            var message = MessageBuilder.createMessage(new ExampleBrokenHeaderEntity("randomValue", "randomName"), headers);
            streamBridge.send("stackoverflow-example", message);
        };
    }

    @Bean
    public Consumer<Message<List<ExampleBrokenHeaderEntity>>> readFromKafkaBatchMode() {
        return messages -> {
            var brokenHeader = ((ArrayList<Map<String, Object>>) messages.getHeaders().get(KafkaHeaders.BATCH_CONVERTED_HEADERS)).get(0).get(HEADER_KEY);
            System.out.println("BATCH - Class header: " + (brokenHeader != null ? brokenHeader.getClass() : null));
        };
    }

    @Bean
    public Consumer<Message<ExampleBrokenHeaderEntity>> readFromKafkaNoBatchMode() {
        return messages -> {
            var brokenHeader = messages.getHeaders().get(HEADER_KEY);
            System.out.println("NO_BATCH - Class header: " + (brokenHeader != null ? brokenHeader.getClass() : null));
        };
    }

    @Bean("kafkaHeaderMapperCustom")
    public KafkaHeaderMapper getKafkaHeaderMapperBatchMode() {
        var kafkaHeaderMapperCustom = new DefaultKafkaHeaderMapper();
        kafkaHeaderMapperCustom.addTrustedPackages("*");
        return kafkaHeaderMapperCustom;
    }
}

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

import java.time.LocalDate;

@Data
@NoArgsConstructor
@AllArgsConstructor
public class CustomHeaderClass {

    private String field1;

    private LocalDate field2;

}

import lombok.AllArgsConstructor;
import lombok.Data;

@Data
@AllArgsConstructor
public final class ExampleBrokenHeaderEntity {

    private String type;

    private String name;

}

spring.cloud.stream.kafka.binder.brokers=x.x.x.x:xxxx

spring.cloud.function.definition=readFromKafkaNoBatchMode;readFromKafkaBatchMode

spring.cloud.stream.bindings.readFromKafkaBatchMode-in-0.destination=stackoverflow-example
spring.cloud.stream.bindings.readFromKafkaBatchMode-in-0.group=readFromKafkaBatchMode
spring.cloud.stream.bindings.readFromKafkaBatchMode-in-0.consumer.batch-mode=true

spring.cloud.stream.bindings.readFromKafkaNoBatchMode-in-0.destination=stackoverflow-example
spring.cloud.stream.bindings.readFromKafkaNoBatchMode-in-0.group=readFromKafkaNoBatchMode

spring.cloud.stream.kafka.binder.headerMapperBeanName=kafkaHeaderMapperCustom

The output of the example is:

NO_BATCH - Class header: class com.example.kafka.header.types.CustomHeaderClass
BATCH - Class header: class org.springframework.kafka.support.DefaultKafkaHeaderMapper$NonTrustedHeaderType

1 Answer


It's a bug; the binder only sets the custom header mapper on a record converter:

private MessageConverter getMessageConverter(
        final ExtendedConsumerProperties<KafkaConsumerProperties> extendedConsumerProperties) {

    MessageConverter messageConverter = BindingUtils.getConsumerMessageConverter(getApplicationContext(),
            extendedConsumerProperties, this.configurationProperties);
    if (messageConverter instanceof MessagingMessageConverter) {
        ((MessagingMessageConverter) messageConverter).setHeaderMapper(getHeaderMapper(extendedConsumerProperties));
    }
    return messageConverter;
}

There should be similar code for when the converter is a BatchMessagingMessageConverter.
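
The missing branch might look something like this (just a sketch of a possible fix, not the actual binder change; the surrounding method is the binder code shown above, and everything else is an assumption):

if (messageConverter instanceof MessagingMessageConverter) {
    ((MessagingMessageConverter) messageConverter)
            .setHeaderMapper(getHeaderMapper(extendedConsumerProperties));
}
else if (messageConverter instanceof BatchMessagingMessageConverter) {
    // Hypothetical fix: BatchMessagingMessageConverter does not extend
    // MessagingMessageConverter, so the instanceof check above skips it,
    // but it exposes its own setHeaderMapper(...) that could be called here.
    ((BatchMessagingMessageConverter) messageConverter)
            .setHeaderMapper(getHeaderMapper(extendedConsumerProperties));
}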

The workaround is to define a custom message converter for the batch consumer:

@Bean("batchConverter")
BatchMessageConverter batchConverter(KafkaHeaderMapper kafkaHeaderMapperCustom) {
    BatchMessagingMessageConverter batchConv = new BatchMessagingMessageConverter();
    batchConv.setHeaderMapper(kafkaHeaderMapperCustom);
    return batchConv;
}
spring.cloud.stream.kafka.bindings.readFromKafkaBatchMode-in-0.consumer.converter-bean-name=batchConverter

With that property set, the output is:

NO_BATCH - Class header: class com.example.demo.So74294156Application$CustomHeaderClass
BATCH - Class header: class com.example.demo.So74294156Application$CustomHeaderClass

Please open an issue against Spring Cloud Stream, referencing this question/answer.

Gary Russell
  • Thanks Gary! I opened this issue https://github.com/spring-cloud/spring-cloud-stream/issues/2550. I promise, next time I'll provide an example committed on GitHub :) – Salvatore Bernardo Nov 03 '22 at 16:47
  • Hi @GaryRussell! Thanks for your answer. Any idea why your converter doesn't work for me? Bean named 'batchConverter' is expected to be of type 'org.springframework.kafka.support.converter.MessagingMessageConverter' but was actually of type 'org.springframework.kafka.support.converter.BatchMessagingMessageConverter' – Lucas Dec 28 '22 at 10:47
  • spring.cloud.stream.kafka.bindings.my-binding.consumer.converter-bean-name: batchConverter – Lucas Dec 28 '22 at 10:48
  • spring-cloud-stream 3.0.12.RELEASE – Lucas Dec 28 '22 at 10:54
  • It needs `...consumer.batch-mode=true` to use a batch converter; if that's not the problem, ask a new question with config details. 3.0.x is no longer supported. https://spring.io/projects/spring-cloud-stream#support – Gary Russell Dec 28 '22 at 15:08
  • It does have the consumer.batch-mode=true. My StreamListener is receiving an array of messages without a problem, but the headers ("kafka_batchConvertedHeaders") are very messy. – Lucas Dec 29 '22 at 08:25
  • Do you want me to open a new question, @GaryRussell? – Lucas Dec 29 '22 at 08:26
  • Hey Gary, just opened a new question: https://stackoverflow.com/q/74948993/3551820 – Lucas Dec 29 '22 at 08:54