
I am faced with a situation where we receive a lot of CSV files from different clients, but there is always some issue with the column count and column lengths that our target table expects.

What is the best way to handle frequently changing CSV files? My goal is to load these CSV files into a Postgres database.

I checked the \COPY command in Postgres, but it does not have an option to create a table.
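For example, a \copy call like the one below (the table and file names are just placeholders) only works if the target table has already been created by hand:

    -- \copy can only load into a table that already exists; it will not create one
    CREATE TABLE clients (id integer, name text, email text);
    \copy clients FROM 'clients.csv' WITH (FORMAT csv, HEADER true)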

Stu
  • Just to clarify: Are you wanting to load all the columns in the CSV files (creating new tables as required) or just those columns that match your current Postgresql table structure? – gsiems Feb 14 '14 at 23:03
  • did you check http://stackoverflow.com/questions/21018256/can-i-automatically-create-a-table-in-postgresql-from-a-csv-file-with-headers ? – Andrew Wolfe Jun 13 '16 at 15:23

2 Answers


You could try creating a pg_dump compatible file instead, one which has the appropriate "create table" section, and use that to load your data.
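For illustration, a minimal pg_dump-style plain-format file might look something like the following (the table, columns, and rows here are made-up placeholders). Replaying it with psql -f both creates the table and loads the rows:

    CREATE TABLE clients (
        id integer,
        name text,
        email text
    );

    -- pg_dump writes the data inline as tab-separated rows terminated by \.
    COPY clients (id, name, email) FROM stdin;
    1	Acme	info@acme.example
    2	Widgets Inc	sales@widgets.example
    \.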

gsiems

I recommend using an external ETL tool like CloverETL, Talend Studio, or Pentaho Kettle for data loading when you have to massage different kinds of data.

\copy is really intended for importing well-formed data in a known structure.

Craig Ringer