
Well. I have a docker-compose.yaml with a Postgres image (it is a simple sample).

And I have a Node.js script that sends a raw SQL query to Postgres:

 COPY (SELECT * FROM mytable) TO '/var/lib/postgresql/data/mytable.csv'

What happens? mytable.csv is saved inside the Postgres container.

What do I need? To save mytable.csv to the HOST MACHINE (or to another container from docker-compose).

Anyway, context: I have big tables (1M+ rows), so it is necessary that the files are generated and saved by the Postgres server itself. But this export will be started by a Node.js script with a COPY query, running in another container / on the host machine.

Does anyone know how to do this?

My docker-compose.yml:

version: "3.6"

services:
  postgres:
    image: postgres:10.4
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=1234
    volumes:
      - postgres-storage:/var/lib/postgresql/data
    ports: 
      - "5432:5432"

UPDATE: I made a diagram in Miro of my process. The main problem is the THIRD step: I can't return the .csv file to Node.js or save it into the Node.js container. I can do 2 things:

  1. Return rows to Node.js and build the file there (but the Node.js server will do this slowly)
  2. Save the .CSV file in the Postgres container. But I need the .CSV file in the Node.js container.

(Diagram: schema with the two containers that I need.)
  • Does this answer your question? [Why query won't save in csv file while it's seems normal in postgresql console](https://stackoverflow.com/questions/59283138/why-query-wont-save-in-csv-file-while-its-seems-normal-in-postgresql-console) – Jim Jones Jul 31 '20 at 12:50
  • Hello @Ivan Ratkin, welcome to SO! Can you please edit your post and add the `Dockerfile` and `docker-compose.yml`, or at least the relevant extracts of those files? This can help others understand/reproduce the problem you are facing, increasing the chances for this question to receive an answer. Thank you! – Neo Anderson Jul 31 '20 at 13:23

1 Answer


Well, thanks to the person who linked this question to the one about COPY TO STDOUT (sorry, I don't remember the question ID).

So, the problem was solved by using COPY TO STDOUT and the small npm module pg-copy-streams.

The code:

const { Client } = require('pg');
const { to: copyTo } = require('pg-copy-streams');
const fs = require('fs');
const { pipeline } = require('stream/promises');

const client = new Client(config);
await client.connect();
const output = fs.createWriteStream('./output.csv');
// client.query() with copyTo() returns a readable stream of CSV data instead of rows
const stream = client.query(copyTo('COPY (SELECT * FROM my_table) TO STDOUT WITH (FORMAT CSV, HEADER)'));
await pipeline(stream, output);

So Postgres streams the CSV to the Node.js script, and on the Node.js side we only need to write this stream to a file, without converting anything to CSV ourselves.

Thanks!
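For completeness, another way to get the file out of the container (not the one I ended up using): bind-mount a host directory into the Postgres container and keep the original COPY ... TO '/path' query. A sketch against the compose file above; the host path ./export is an assumption:

```yaml
version: "3.6"

services:
  postgres:
    image: postgres:10.4
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=1234
    volumes:
      - postgres-storage:/var/lib/postgresql/data
      # host folder; COPY ... TO '/export/mytable.csv' lands here
      - ./export:/export
    ports:
      - "5432:5432"
```

Note that server-side COPY ... TO a file requires superuser privileges on Postgres 10, so the STDOUT approach is usually simpler.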