Well. I have a docker-compose.yaml with a Postgres image (it's a simple example).
And I have a Node.js script that runs a raw SQL query against Postgres:
'COPY (SELECT * FROM mytable) TO '/var/lib/postgresql/data/mytable.csv''
What happens? mytable.csv is saved inside the Postgres container.
What do I need? To save mytable.csv on the HOST MACHINE (or in another container from the docker-compose setup).
Anyway, some context: I have big tables (1M+ rows), so the files need to be generated and saved by the Postgres server itself. But this saving process will be triggered via a Node.js script issuing the COPY query from another container or from the host machine.
Does anyone know how to do this?
my docker-compose.yml:
version: "3.6"
services:
  postgres:
    image: postgres:10.4
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=1234
    volumes:
      - postgres-storage:/var/lib/postgresql/data
    ports:
      - "5432:5432"
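One way I could imagine getting the file onto the host is to bind-mount a host directory into the Postgres container and point COPY at it. A minimal sketch, assuming a hypothetical `./exports` directory on the host (everything else repeats my compose file above; the same named volume could also be mounted into a second container to share the file):

```yaml
version: "3.6"
services:
  postgres:
    image: postgres:10.4
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=1234
    volumes:
      - postgres-storage:/var/lib/postgresql/data
      - ./exports:/exports   # hypothetical bind mount: host ./exports appears in the container
    ports:
      - "5432:5432"
```

With this, `COPY (SELECT * FROM mytable) TO '/exports/mytable.csv'` would write inside the container but the file would also appear in `./exports` on the host.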
UPDATE: I made a diagram of my process in Miro. The main problem is in step THREE: I can't return the .csv file to Node.js or save it into the Node.js container. I can do 2 things:
- Return the rows and build the file in Node.js (but the Node.js server will do this slowly)
- Save the .csv file in the Postgres container. But I need the .csv file in the Node.js container.

Schema with the two containers that I need
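A third option I'm considering is `COPY ... TO STDOUT`: the query still runs and formats the CSV server-side, but the result is streamed to the client, so the file can be written wherever the Node.js process runs. A sketch of this, assuming the `pg` and `pg-copy-streams` npm packages, a hypothetical database name, and connection settings matching my compose file:

```javascript
const fs = require('fs');
const { Client } = require('pg');
const { to: copyTo } = require('pg-copy-streams');

async function exportCsv() {
  const client = new Client({
    host: 'localhost',    // or the service name "postgres" from another container
    port: 5432,
    user: 'user',
    password: '1234',
    database: 'postgres', // hypothetical database name
  });
  await client.connect();

  // COPY ... TO STDOUT is executed by the Postgres server, but the CSV bytes
  // are streamed to this client instead of being written to a server-side file.
  const source = client.query(copyTo(
    "COPY (SELECT * FROM mytable) TO STDOUT WITH (FORMAT csv, HEADER)"
  ));
  const target = fs.createWriteStream('./mytable.csv');

  // Pipe the server stream straight to disk; rows are never buffered in full,
  // so this should stay cheap even for 1M+ row tables.
  await new Promise((resolve, reject) => {
    source.on('error', reject);
    target.on('error', reject);
    target.on('finish', resolve);
    source.pipe(target);
  });

  await client.end();
}

exportCsv().catch(console.error);
```

This would avoid both of the options above: Node.js doesn't parse rows into objects (it only pipes bytes), and no file is left stranded inside the Postgres container.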