We have a table in GCP BigQuery that is updated automatically every hour. There is a new requirement to move this table (it's around 7 MB, 44k records) to an on-prem SFTP server, and then append its delta to it every 12 hours.
We have thought of a multi-step approach:
- Move the data from BigQuery to GCS
- Then push the data from GCS to the on-prem server over private connectivity such as Cloud VPN or Interconnect.
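For context, here is a sketch of what we want the first step to do, expressed with the `bq` CLI (the table and bucket names below are placeholders, and the command is echoed as a dry run rather than executed):

```shell
# Sketch of the BigQuery -> GCS export step.
# 'mydataset.hourly_table' and 'my-export-bucket' are placeholder names.
TABLE='mydataset.hourly_table'
STAMP=$(date -u +%Y%m%d%H)
DEST="gs://my-export-bucket/exports/${STAMP}/part-*.csv"
# Echoed for illustration; drop the leading echo to actually run the export.
echo bq extract --destination_format=CSV "${TABLE}" "${DEST}"
```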
We are trying to use Dataflow to move the data from BigQuery to GCS, but it doesn't seem to be a great fit. Does Dataflow require a custom template to move data from BigQuery to GCS? The current classic templates don't seem to support this. Please let me know if there is any additional information I can share.
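For the 12-hour delta part of the requirement, we are assuming a watermark-style filter over an update timestamp. A minimal pure-Python sketch of that selection logic (the `updated_at` column name is hypothetical, not confirmed by our schema):

```python
from datetime import datetime, timezone

def select_delta(rows, last_export_ts):
    """Keep only rows updated after the previous export run.

    Assumes each row carries an 'updated_at' timestamp; that column
    name is hypothetical and used here only for illustration.
    """
    return [r for r in rows if r["updated_at"] > last_export_ts]

# Tiny illustration: only one row changed since the last 12-hour run.
rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, 0, 0, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 1, 13, 0, tzinfo=timezone.utc)},
]
last_run = datetime(2024, 1, 1, 6, 0, tzinfo=timezone.utc)
delta = select_delta(rows, last_run)  # only id 2 qualifies
```

In practice the same filter would live in the export query itself (a WHERE clause on the watermark column), with only the matching rows appended on the SFTP side.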