I'd like to use Docker to load a .sql file into a PostgreSQL database and serve it.
The following command returns almost immediately, before the database has been fully populated; the import appears to happen asynchronously in the background.
docker run \
--name myContainer \
-p 5432:5432 \
-v "$PWD/mySqlFile.sql":/docker-entrypoint-initdb.d/dump.sql \
-d postgres:alpine
Since this command will be part of a script, I'd like it to block until the database has been fully initialised with mySqlFile.sql. How would I go about making the command block, or otherwise pause the script, until mySqlFile.sql has been loaded?
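One idea I have is to poll for readiness from the host after starting the container. My understanding (which may well be wrong) is that the official image's entrypoint runs the init scripts against a temporary server that only listens on the Unix socket, and only opens the TCP port once they have finished, so a sketch along these lines might work:
# Keep polling over TCP until the final server is accepting connections.
# Assumes the entrypoint only listens on TCP after the init scripts complete.
until docker exec myContainer pg_isready -h 127.0.0.1 -p 5432 -U postgres -q; do
  sleep 1
done
But that feels indirect, and I'm not sure the TCP-only-after-init behaviour is something I can rely on, so a way to make docker run itself block (or an officially supported wait) would be preferable.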
For clarity: I run the above command, and then the following command:
echo "select count(*) as \"Posts imported\" from posts;" | docker exec -i myContainer psql -U postgres
Posts imported
----------------
0
(1 row)
After waiting a few seconds and running it again, I get this:
echo "select count(*) as \"Posts imported\" from posts;" | docker exec -i myContainer psql -U postgres
Posts imported
----------------
51103
(1 row)
This is a problem because I am building an automated solution: I need the docker run command to block until all records have been imported into Postgres, so that the database is in a known, ready state for the next command in the script.
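The only other workaround I can think of is to watch the container logs until the server's "database system is ready to accept connections" message has appeared twice (once for the temporary server the init scripts run against, once for the final server), roughly like this, but I don't know how robust that is:
# Rough sketch: wait until the "ready" message has been logged twice,
# i.e. both the init-phase server and the final server have started.
until [ "$(docker logs myContainer 2>&1 | grep -c 'ready to accept connections')" -ge 2 ]; do
  sleep 1
done
Is there a cleaner way to achieve this?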