
The main purpose is to read a stream from a topic, apply some transformations, and then send two events to other topics. For that we are using the KStream.branch() function with functional-style programming. The code is:

Input POJO:

@Data
@NoArgsConstructor
@AllArgsConstructor
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonIgnoreProperties(ignoreUnknown = true)
public class FooInput {

    @JsonProperty("field1")
    private String field1;

    @JsonProperty("field2")
    private String field2;
}

Output POJO:

@Getter
@Setter
@ToString
@EqualsAndHashCode
public class FooEvent<T> extends EventInfo {

    @JsonProperty(value = "entity")
    private T entity;

    @Builder
    private FooEvent(T entity, String eventId, OffsetDateTime eventTime, Action eventAction, String eventSourceSystem, String eventEntityName) {
        super(eventId, eventTime, eventAction, eventSourceSystem, eventEntityName);
        this.entity = entity;
    }

    public FooEvent() {
        super();
    }

}

Base event class:

@Setter
@Getter
@ToString
@AllArgsConstructor
@NoArgsConstructor
public abstract class EventInfo {

    @JsonProperty(value = "eventId")
    private String eventId;

    @JsonProperty(value = "eventTime")
    private OffsetDateTime eventTime;

    @JsonProperty(value = "eventAction")
    private Action eventAction;

    @JsonProperty(value = "eventSourceSystem")
    private String eventSourceSystem;

    @JsonProperty(value = "eventEntityName")
    private String eventEntityName;
}

Entity POJO:

@Data
@NoArgsConstructor
@AllArgsConstructor
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonIgnoreProperties(ignoreUnknown = true)
public class Bar {

    @JsonProperty("field1")
    private String field1;

    @JsonProperty("field2")
    private String field2;

    @JsonProperty("field3")
    private String field3;
}

Processor function:

    @Bean
    public Function<KStream<String, FooInput>, KStream<String, FooEvent<Bar>>[]> process() {

        Predicate<String, FooEvent<Bar>> predicate1 =
            (key, value) -> value.getEntity().getField1().equalsIgnoreCase("test1");
        Predicate<String, FooEvent<Bar>> predicate2 =
            (key, value) -> value.getEntity().getField1().equalsIgnoreCase("test2");

        return input -> input
            ...
            .branch(predicate1, predicate2);
    }
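
The elided part of the chain converts each FooInput into a FooEvent<Bar> (that is what the predicates and the return type require). Just for context, a simplified, hypothetical sketch of such a mapping (the field mapping and event metadata below are illustrative, not the real transformation) could look like this:

        // Illustrative only; assumes org.apache.kafka.streams.KeyValue,
        // java.time.OffsetDateTime and java.util.UUID are imported.
        return input -> input
            .map((key, value) -> KeyValue.pair(key,
                FooEvent.<Bar>builder()
                    .entity(new Bar(value.getField1(), value.getField2(), null))
                    .eventId(UUID.randomUUID().toString())
                    .eventTime(OffsetDateTime.now())
                    .build()))
            .branch(predicate1, predicate2);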

The bindings are declared in application.properties:

Input:

spring.cloud.stream.bindings.process-in-0.destination=topic0
spring.cloud.stream.bindings.process-in-0.content-type=application/json

Output:

spring.cloud.stream.bindings.process-out-0.destination=topic1
spring.cloud.stream.bindings.process-out-0.content-type=application/json

spring.cloud.stream.bindings.process-out-1.destination=topic2
spring.cloud.stream.bindings.process-out-1.content-type=application/json

The issue appears when the application evaluates the predicates. At that point it tries to convert the record value to FooEvent<Bar>. The eventId, eventTime, eventAction, ... fields are converted just fine, but the generic entity field (Bar in this case) ends up as a HashMap of the values instead of a Bar object with its fields set. This leads me to believe that Spring's default Serde (JsonSerde) loses the generic type information. Any suggestions on how to solve this generic-type Serde problem in Kafka Streams?
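
For what it's worth, the behaviour looks like ordinary Jackson type erasure rather than something Kafka-specific: when only the raw FooEvent class is known, Jackson cannot tell what T is and binds the entity object to a Map, whereas a fully resolved parameterized type produces a real Bar. A minimal sketch that reproduces the difference outside Kafka (the JSON payload here is made up; the classes are the POJOs above):

import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.ObjectMapper;

public class GenericBindingDemo {

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        String json = "{\"eventId\":\"1\",\"entity\":{\"field1\":\"test1\",\"field2\":\"x\",\"field3\":\"y\"}}";

        // Raw class only: Jackson does not know T, so entity ends up as a LinkedHashMap.
        FooEvent<?> raw = mapper.readValue(json, FooEvent.class);
        System.out.println(raw.getEntity().getClass());   // class java.util.LinkedHashMap

        // Fully resolved type: entity is bound to a real Bar instance.
        JavaType resolved = mapper.getTypeFactory()
            .constructParametricType(FooEvent.class, Bar.class);
        FooEvent<Bar> typed = mapper.readValue(json, resolved);
        System.out.println(typed.getEntity().getClass()); // class Bar
    }
}

If I read the JsonSerde API correctly it can be given such a resolved type instead of a raw class, but I don't see where to plug that into the binder's default Serde resolution for the process-in-0/process-out-* bindings.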
