
I am trying to copy all the tables from a schema (PostgreSQL, 50+ tables) to Amazon S3.

What is the best way to do this? I could create 50 different copy activities, but is there a simpler way to copy every table in the schema, or to write one pipeline and loop over the tables?

John Rotenstein

2 Answers


I think the old method is:

 1. Unload your data from PostgreSQL to CSV files first, using something like psql.
 2. Then copy the CSV files to S3.

But AWS gives you a ready-made way to do this, RDSToS3CopyActivity; see this link from AWS.
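If you do go the manual route, here is a minimal sketch of the "one loop over all tables" idea in Python, using psycopg2 and boto3 instead of psql; the connection string, schema name, and bucket name are placeholders you would replace with your own:

```python
# Sketch: dump every table in a schema to CSV and upload each file to S3.
# Connection details, SCHEMA, and BUCKET below are placeholders (assumptions).
import psycopg2
import boto3

SCHEMA = "public"             # schema to export (assumption)
BUCKET = "my-export-bucket"   # target S3 bucket (assumption)

conn = psycopg2.connect("host=my-rds-host dbname=mydb user=myuser password=mypassword")
s3 = boto3.client("s3")

with conn.cursor() as cur:
    # List all base tables in the schema.
    cur.execute(
        "SELECT table_name FROM information_schema.tables "
        "WHERE table_schema = %s AND table_type = 'BASE TABLE'",
        (SCHEMA,),
    )
    tables = [row[0] for row in cur.fetchall()]

for table in tables:
    filename = f"{table}.csv"
    with conn.cursor() as cur, open(filename, "w") as f:
        # COPY ... TO STDOUT streams the table out as CSV.
        cur.copy_expert(f'COPY "{SCHEMA}"."{table}" TO STDOUT WITH CSV HEADER', f)
    # Upload each CSV under a per-table key.
    s3.upload_file(filename, BUCKET, f"{SCHEMA}/{filename}")

conn.close()
```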

Nishant Singh

Since you have a large number of tables, I would recommend AWS Glue over AWS Data Pipeline. Glue is easy to configure, and its crawlers give you the flexibility to choose columns, define schemas, and so on. Moreover, the underlying jobs in AWS Glue are PySpark jobs, which scale well and give you very good performance.
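For reference, the body of a Glue ETL job is an ordinary PySpark script. A minimal sketch of reading one crawled table and writing it to S3 as CSV might look roughly like this; the catalog database, table name, and bucket path are placeholders, not anything from your setup:

```python
# Sketch of a Glue ETL job: read a table from the Glue Data Catalog
# (populated by a crawler) and write it to S3 as CSV.
import sys
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the table as catalogued by a Glue crawler.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="my_postgres_db",     # Data Catalog database (assumption)
    table_name="public_my_table",  # crawled table name (assumption)
)

# Write it out to S3; Glue runs this as a distributed Spark job.
glue_context.write_dynamic_frame.from_options(
    frame=dyf,
    connection_type="s3",
    connection_options={"path": "s3://my-export-bucket/public_my_table/"},
    format="csv",
)

job.commit()
```

You would still need one job (or a parameterised job run per table) to cover the whole schema, but Glue handles the extraction, scaling, and S3 writes for you.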