I'm currently using Airflow's PostgresToGoogleCloudStorageOperator and GoogleCloudStorageToBigQueryOperator to export every table of my Postgres DB (hosted on AWS RDS) to BigQuery. It works, but with 75 tables Airflow creates 75 * 2 tasks. Since I'm new to Airflow, I don't know whether this is good practice.
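For context, here is a stripped-down sketch of how the per-table tasks get generated in my DAG. The table names, bucket, and dataset are placeholders, and I've left out the actual Airflow operator instantiation to keep it short — the real DAG builds one instance of each of the two operators above per table, using values like these:

```python
# Placeholder table list -- in reality this has 75 entries.
TABLES = ["users", "orders", "invoices"]

def build_task_specs(tables, bucket="my-bucket", dataset="my_dataset"):
    """Return one export spec and one load spec per table (2 * len(tables) total)."""
    specs = []
    for table in tables:
        # Fed to PostgresToGoogleCloudStorageOperator: dump the table to GCS
        specs.append({
            "task_id": f"export_{table}",
            "sql": f"SELECT * FROM {table};",
            "filename": f"exports/{table}.json",
        })
        # Fed to GoogleCloudStorageToBigQueryOperator: load that file into BigQuery
        specs.append({
            "task_id": f"load_{table}",
            "bucket": bucket,
            "source_objects": [f"exports/{table}.json"],
            "destination_project_dataset_table": f"{dataset}.{table}",
        })
    return specs

print(len(build_task_specs(TABLES)))  # 2 tasks per table
```

So the DAG itself is just a loop like this over the table list, which is where the 150 tasks come from.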
In any case, I'd like to find a way to export all the tables at once (pg_dump?) to GCS and then import them into BigQuery.