
I have the following BiConsumer:


@Bean
public BiConsumer<KStream<String, UserEvent>, GlobalKTable<Long, ShopEvent>> process() {
    return (userEvents, shopEvents) -> {
        userEvents
                // look up the shop for each user event by its shopId
                .leftJoin(shopEvents,
                        (key, userEvent) -> userEvent.getShopId(),
                        // the ValueJoiner just keeps the user event here
                        (userEvent, shopEvent) -> userEvent)
                // filterApplier.isEventValied is my own validation helper
                .filter((key, userEvent) -> filterApplier.isEventValied(userEvent))
                .foreach((key, userEvent) -> log.info(userEvent.toString()));
    };
}

I would like to consume the messages in batches. I found the following documentation for the apache-kafka-binder, which describes properties like:

spring.cloud.stream.bindings.process-in-0.consumer.batch-mode: true

and

spring.cloud.binders.kstream-binder:
          consumer-properties.max.poll.interval.ms: 1000
          consumer-properties.max.poll.records: 1
          consumer-properties.fetch.max.wait.ms: 1000
          consumer-properties.fetch.min.bytes: 5000
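
Here is a minimal sketch of how I would combine those two snippets in one application.yml, keeping the binding name process-in-0 and the binder name kstream-binder from above (whether batch-mode actually has any effect under the Kafka Streams binder is exactly what I am asking):

spring:
  cloud:
    stream:
      bindings:
        process-in-0:
          consumer:
            batch-mode: true             # batch consumption, per the apache-kafka-binder docs
    binders:
      kstream-binder:
        consumer-properties:
          max.poll.interval.ms: 1000     # max delay allowed between poll() calls
          max.poll.records: 1            # max records returned by a single poll()
          fetch.max.wait.ms: 1000        # how long the broker waits if fetch.min.bytes is not met
          fetch.min.bytes: 5000          # minimum amount of data the broker returns per fetch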

As described in this tutorial, you can have something like this with the apache-kafka-binder:

@Bean
public Consumer<List<String>> input() {
    return list -> {
        System.out.println(list);
        throw new RuntimeException("test");
    };
}
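
Ideally I would like the Kafka Streams equivalent of that, something along the lines of the purely hypothetical signature below (not a real binder feature, just to illustrate the shape I am after):

// hypothetical -- a list-typed KStream value to receive records in batches
@Bean
public BiConsumer<KStream<String, List<UserEvent>>, GlobalKTable<Long, ShopEvent>> processBatch() {
    return (userEventBatches, shopEvents) -> userEventBatches
            .foreach((key, batch) -> log.info("received batch of {} user events", batch.size()));
}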

But is there any way to implement the same thing with the kafka-streams-binder?
