I currently have an Apache Beam pipeline that writes Pub/Sub messages to BigQuery and GCS in real time. My next goal is to pull messages from Pub/Sub in 5-minute windows and then, after an hour of collection, perform analysis collectively on the windowed data that has accumulated. So technically 12 windowed batches are collected per hour, all 12 are analyzed together at the end of the hour, and the results are written to my desired sinks. How can I achieve this?
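Here is a rough sketch of the pattern I think I need (Python SDK): fixed 5-minute windows, then re-windowing those per-window results into 1-hour windows so that twelve of them land in each hourly group. The subscription path, the decoding step, and `analyze_hour` are placeholders for my actual setup. Is something like this the right approach?

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window

# Placeholder for my real subscription path.
SUBSCRIPTION = "projects/my-project/subscriptions/my-sub"

def analyze_hour(batches):
    # Placeholder analysis: `batches` is a list of up to 12
    # five-minute result lists from the same hour.
    return {"num_batches": len(batches)}

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    five_min_results = (
        p
        | "Read" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
        | "Decode" >> beam.Map(lambda b: b.decode("utf-8"))
        | "5MinWindows" >> beam.WindowInto(window.FixedWindows(5 * 60))
        # One aggregate per 5-minute window; its timestamp is the window end,
        # so it falls into the correct hourly window below.
        | "CollectWindow" >> beam.CombineGlobally(
            beam.combiners.ToListCombineFn()).without_defaults()
    )
    hourly = (
        five_min_results
        # Re-window the twelve 5-minute results into one 1-hour window.
        | "1HourWindows" >> beam.WindowInto(window.FixedWindows(60 * 60))
        | "CollectHour" >> beam.CombineGlobally(
            beam.combiners.ToListCombineFn()).without_defaults()
        | "Analyze" >> beam.Map(analyze_hour)
    )
    # `hourly` would then be written to BigQuery / GCS
    # the same way my existing pipeline does.
```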
I hope I've made myself clear; please ask if you need more details. Thanks in advance!