The Kafka Streams Processor API processes one record at a time, but you can achieve batching by combining a `Transformer` with a state store for stateful operations.
It sounds like you need batching by both size and time (e.g. if the batch size is not reached within 5 seconds, propagate whatever has been collected so far). If so, take a look at the example How to process a KStream in a batch of max size or fallback to a time window. If you need purely time-based aggregation (rather than by size), take a look at windowing.
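To make the size-or-time idea concrete, here is a minimal sketch of the flush logic you would put inside such a `Transformer`: `transform()` would call `add()` for each record, and a wall-clock punctuator registered via `ProcessorContext.schedule()` would call `flushIfExpired()`. The class and member names (`BatchBuffer`, `maxSize`, `maxAgeMs`) are illustrative, not Kafka Streams API, and a real implementation would keep the buffer in a state store rather than in memory.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative size-or-time batching logic (not Kafka Streams API).
// In a real Transformer the buffer would live in a state store so it
// survives restarts and rebalances.
class BatchBuffer<V> {
    private final int maxSize;      // flush when this many records collected
    private final long maxAgeMs;    // ...or when the batch is this old
    private final List<V> buffer = new ArrayList<>();
    private long firstRecordTs = -1;

    BatchBuffer(int maxSize, long maxAgeMs) {
        this.maxSize = maxSize;
        this.maxAgeMs = maxAgeMs;
    }

    // Called for every incoming record (i.e. from transform()).
    // Returns a full batch to forward downstream, or null while collecting.
    List<V> add(V value, long nowMs) {
        if (buffer.isEmpty()) {
            firstRecordTs = nowMs;
        }
        buffer.add(value);
        return buffer.size() >= maxSize ? drain() : null;
    }

    // Called periodically (i.e. from a wall-clock punctuator).
    // Returns the partial batch if it has been open too long, else null.
    List<V> flushIfExpired(long nowMs) {
        if (!buffer.isEmpty() && nowMs - firstRecordTs >= maxAgeMs) {
            return drain();
        }
        return null;
    }

    private List<V> drain() {
        List<V> batch = new ArrayList<>(buffer);
        buffer.clear();
        firstRecordTs = -1;
        return batch;
    }
}
```

Whichever path triggers first (size or age) drains the buffer, so a slow topic still emits a batch after `maxAgeMs` instead of stalling forever.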
Actually, you don't need any special configuration properties to use the Processor API, only the regular Kafka Streams properties such as `bootstrap.servers`, `application.id`, `auto.offset.reset`, etc. For batching specifically, the only extra step is declaring a state store.
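For reference, a minimal configuration might look like the following (the broker address and application id are placeholders):

```properties
# Regular Kafka Streams properties -- nothing Processor-API-specific
bootstrap.servers=localhost:9092
application.id=batching-example-app
auto.offset.reset=earliest
```

The state store itself is declared in code on the topology (e.g. via `StreamsBuilder#addStateStore`), not in the properties file.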