I'm preparing our application for the next Spring Cloud Stream release (currently on 3.0.0.RC1), using the Kafka binder.
Right now we receive one message, process it, and resend it to another topic. Handling each message separately results in a lot of single requests to our database.
With the 3.0.0 release we want to process messages as a batch, so we can save the data in one batch update.
In the current version we use @EnableBinding and @StreamListener:
@StreamListener( ExchangeableItemProcessor.STOCK_INPUT )
public void processExchangeableStocks( final ExchangeableStock item ) {
    publishItems( exchangeableItemProcessor.stocks(), articleService.updateStockInformation( Collections.singletonList( item ) ) );
}

void publishItems( final MessageChannel messageChannel, final List<? extends ExchangeableItem> items ) {
    for ( final ExchangeableItem exchangeableItem : items ) {
        final Message<ExchangeableItem> message = MessageBuilder.withPayload( exchangeableItem )
                .setHeader( "partitionKey", exchangeableItem.getId() )
                .build();
        messageChannel.send( message );
    }
}
I've set the consumer properties to batch mode and changed the listener signature to List<ExchangeableStock>, but doing so results in receiving a List<byte[]> instead of the expected List<ExchangeableStock>.
Of course it's possible to do the conversion afterwards, but that feels like a workaround; I think the conversion is something that should happen before the listener is called.
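For reference, the batch-mode consumer setting I enabled looks roughly like this (the binding name, destination, and group here are placeholders for our actual configuration, not the real values):

```yaml
spring:
  cloud:
    stream:
      bindings:
        stock-input:                 # placeholder binding name
          destination: stocks        # placeholder topic
          group: article-service     # placeholder consumer group
          consumer:
            batch-mode: true         # deliver records as List<...> batches
```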
Then I tried the (new) functional model, and consuming works fine. I also like how simple the processing becomes:
@Bean
public Function<List<ExchangeableStock>, List<ExchangeableStock>> stocks() {
    return articleService::updateStockInformation;
}
But the output topic now receives the whole list as a single message, and the following consumers no longer work correctly.
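To make concrete what I want on the output topic, here is a framework-free sketch (the Stock, Envelope, and split names are mine, just for illustration, not Spring types): each element of the processed batch should become its own message, keyed by its id, rather than one message carrying the whole list.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class BatchSplitSketch {

    // Stand-in for the real payload type; only the id matters here.
    record Stock( String id, int quantity ) {}

    // Stand-in for a Spring Message: a payload plus headers.
    record Envelope( Stock payload, Map<String, Object> headers ) {}

    // One envelope per list element, keyed by id -- the shape the old
    // @StreamListener/publishItems code produced, and what I still want
    // after switching to batch processing.
    static List<Envelope> split( final List<Stock> batch ) {
        final List<Envelope> messages = new ArrayList<>( batch.size() );
        for ( final Stock stock : batch ) {
            messages.add( new Envelope( stock, Map.of( "partitionKey", stock.id() ) ) );
        }
        return messages;
    }
}
```

So the batch should only exist on the consuming/processing side; on the producing side I still need one message per item.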
I think I'm missing something...
Do I need to add some kind of MessageConverter (for the annotation-driven version), or is there a way to achieve the desired behavior with the functional version, too?