
I am trying to dump table data using pg_dump, something like this:

pg96\bin\pg_dump ... --format plain --section data --column-inserts --file obj.account.backup --table obj.account database_xyz

Instead of getting

INSERT INTO obj.account(name, email, password) VALUES ('user1','email1','password1');
INSERT INTO obj.account(name, email, password) VALUES ('user2','email2','password2');

I would like to get

INSERT INTO obj.account (name, email, password) VALUES 
('user1','email1','password1'),
('user2','email2','password2');

Is there a way to do this without any non-PostgreSQL post-processing?

Rainer
  • using `COPY` is not an option?.. – Vao Tsun Jun 11 '17 at 13:43
  • No, as far as I know, with COPY I get a CSV or TSV file. It's not what I want. Sure, I could edit it afterwards, but that's what I would like to avoid. – Rainer Jun 11 '17 at 13:48
  • COPY also has a binary format, and you can re-load the data directly into the table (also using COPY). No need for post-processing. See `COPY TO` and `COPY FROM`. – pbuck Jun 11 '17 at 16:31
  • Right, but this format is not useful if you would like to edit the data frequently in a text-oriented programming IDE, especially if you have more than 10 columns. I'm not looking for workarounds. If it is not possible, no problem; I know enough workarounds. The question was simply: is it possible or not? If not, that's OK. – Rainer Jun 11 '17 at 20:06
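The COPY round-trip suggested in the comments could look something like the following sketch. The table and database names are taken from the question; the output file name is a placeholder, and this assumes `psql` access to the database:

```shell
# Export the table data as editable tab-separated text (client-side \copy):
psql -d database_xyz -c "\copy obj.account TO 'account.tsv'"

# ...edit account.tsv in a text editor if desired...

# Re-load the edited data into the table:
psql -d database_xyz -c "\copy obj.account FROM 'account.tsv'"
```

`\copy` runs client-side, so no server file-system access is needed; the server-side `COPY` command works the same way but reads/writes files on the database host.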

2 Answers


There is no way to get INSERT statements like that with pg_dump.

Laurenz Albe

Since PostgreSQL 12, you can use pg_dump with --rows-per-insert=nrows; see https://www.postgresql.org/docs/12/app-pgdump.html

I'm aware that this is an old question, but I wanted to mention it in case somebody else (like me) finds this while searching for a solution. There are cases where COPY can't be used, and for bigger data sets a multi-row INSERT statement is much faster to import than one INSERT per row.
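Adapting the question's command, a sketch might look like this (assuming the database and table names from the question, and a pg_dump binary from PostgreSQL 12 or later; the row count of 1000 is an arbitrary choice):

```shell
# --rows-per-insert emits multi-row INSERT commands instead of COPY;
# --column-inserts additionally lists the column names explicitly.
pg_dump --format plain --section data \
        --column-inserts --rows-per-insert 1000 \
        --file obj.account.backup --table obj.account database_xyz
```

Each emitted INSERT then carries up to 1000 VALUES rows, which is close to the format asked for in the question.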