I use Python and Apache Beam to read streaming data from Kafka and insert it into a BigQuery table, but I want to insert the data in batches instead of via streaming inserts.
I tried setting the pipeline's streaming mode to True and passing a batch size to WriteToBigQuery, but the data was still inserted into the BQ table via streaming inserts. I also tried setting streaming mode to False, but the Kafka topic has too much data to read and the pipeline got stuck. Is there any way to do this?
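Roughly what I tried, as a simplified sketch (the bootstrap server, topic, project, table, and schema names here are placeholders, and the parse step is a stand-in for my actual decoding logic):

```python
import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
# First attempt: streaming=True; I also tried streaming=False, which got stuck.
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromKafka" >> ReadFromKafka(
            consumer_config={"bootstrap.servers": "localhost:9092"},  # placeholder
            topics=["my-topic"],  # placeholder
        )
        # Placeholder parse step: Kafka records arrive as (key, value) byte pairs.
        | "Parse" >> beam.Map(lambda kv: {"value": kv[1].decode("utf-8")})
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:my_dataset.my_table",  # placeholder
            schema="value:STRING",  # placeholder
            batch_size=500,  # I hoped this would make the writes batched
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

Even with batch_size set, the rows still showed up in BigQuery via the streaming insert path rather than as batch loads.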