Questions tagged [pg-dump]

10 questions
10 votes • 1 answer

PostgreSQL 13 - Speed up pg_dump to 5 minutes instead of 70 minutes

We use pg_dump nightly to make a snapshot of our database. For a long time we did this with a simple command: pg_dump -Fc database_name. This takes about an hour and produces a file of 30+ GB. How can we speed things up?
Janning • 1,421 • 2 • 21 • 38
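The usual lever here is pg_dump's parallel mode; a minimal sketch, assuming roughly 8 spare cores and a placeholder output path (the database name is the one from the question):

    # Only the directory output format (-Fd) supports parallel dumping (-j);
    # each worker dumps and compresses its own tables.
    pg_dump -Fd -j 8 -f /backups/database_name.dir database_name

    # The matching restore can be parallelised the same way:
    pg_restore -j 8 -d database_name /backups/database_name.dir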
5 votes • 1 answer

Using pg_dump when connecting via a service

I connect to my postgres server using psql "service=subscription". How do I use the service with pg_dump? I tried the following: pg_dump -h "service=subscription" > /home/fogest/dump.out This, however, did not work. How should I be doing…
ComputerLocus • 193 • 4 • 12
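pg_dump takes the service as a connection string in place of the database name rather than through -h; a minimal sketch, assuming a ~/.pg_service.conf entry named subscription (the output path is the one from the question):

    # Pass the connection string where the database name would normally go...
    pg_dump "service=subscription" > /home/fogest/dump.out

    # ...or hand it to -d/--dbname, which also accepts a connection string.
    pg_dump -d "service=subscription" > /home/fogest/dump.out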
4 votes • 1 answer

pg_dump with -w flag does not read from .pgpass file

I am trying to back up my PostgreSQL database called crewdb on Ubuntu 18.04 LTS from a script with the following command in it: pg_dump -h localhost -p 5432 -U postgres -w -C -F p -b -v -f ~/Dropbox\/postgres_backup/crewdb.backup.sql crewdb I know…
Christian Hick • 145 • 2 • 7
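With -w pg_dump never prompts, so the password has to come from a ~/.pgpass entry whose fields match the connection and whose permissions are strict; a minimal sketch, with the password and the user's home path as placeholders:

    # ~/.pgpass: hostname:port:database:username:password, one entry per line
    # (fields may be the wildcard *); the file is ignored unless its mode is 0600.
    localhost:5432:crewdb:postgres:your_password_here

    chmod 600 ~/.pgpass
    # If the script runs as a different user (e.g. from cron), point at the
    # file explicitly:
    PGPASSFILE=/home/youruser/.pgpass pg_dump -h localhost -p 5432 -U postgres -w crewdb > crewdb.backup.sql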
1 vote • 1 answer

Docker Postgres backup creates an empty file when run as a cron job

I'm trying to create a cron job that creates DB backups every night. My crontab has the job: * * * * * /home/user/scripts/backup.sh (I have it set to run every minute for testing). In backup.sh, I have: docker exec -it dbContainer pg_dump -U username -d…
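A frequent culprit in this setup is the -t flag: cron provides no terminal, so docker exec cannot allocate a TTY and the redirected file stays empty. A sketch of backup.sh under that assumption (container, user and database names are placeholders):

    #!/bin/bash
    # No -it under cron: there is no terminal to attach, and the dump should
    # be redirected to a file rather than written to a pseudo-TTY.
    docker exec dbContainer pg_dump -U username -d dbname \
        > /home/user/backups/dbname_$(date +%F).sql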
1 vote • 1 answer

pg_dump complains PostgreSQL version 11 is not installed after installing postgres-client-13

We just upgraded our PostgreSQL servers to v13. We heavily use pg_dump on Ubuntu 18 systems to transfer data between databases. After upgrading the servers, pg_dump would complain about a version mismatch. Easy enough, I installed the…
Christian • 81 • 6
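On Debian/Ubuntu the pg_dump on the PATH is a wrapper that picks a per-cluster binary, so one way around the mismatch is to call the version-13 client directly; a sketch, with the host and database names as placeholders:

    # The postgresql-client-13 package installs its binaries here:
    /usr/lib/postgresql/13/bin/pg_dump -h db-server -Fc mydb > mydb.dump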
0 votes • 1 answer

Postgres pg_dump using compression - Keeping schema only for some tables

I am currently taking nightly exports of a Postgres 10.5 database, taking schema only for 2 very large tables whose data I don't need. I am compressing the export when done, and would really like to use pg_dump's internal compression…
emmdee • 2,187 • 12 • 36 • 60
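pg_dump can do both in one pass: the custom format is compressed internally, and --exclude-table-data keeps a table's definition while skipping its rows; a sketch, where the database and table names are placeholders:

    pg_dump -Fc \
        --exclude-table-data=big_table_1 \
        --exclude-table-data=big_table_2 \
        -f nightly.dump mydatabase

The option takes a pattern, so a schema-qualified wildcard such as --exclude-table-data='audit.*' should also work.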
0 votes • 0 answers

Replicate AWS RDS Postgres instances

We are facing a situation where we need to consolidate a few small Postgres instances into a bigger one. I can't figure out how to replicate the "old" DB's new data to the "new" DB while the replacement happens. I'll try to simplify it: the old DB instance name is X, the new DB instance is…
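For the one-off copy of each small instance into the consolidated one, pg_dump followed by pg_restore is usually enough; catching rows written after that copy is what logical replication (a publication on the old instance, a subscription on the new one) is typically used for. A sketch of the copy step, with endpoints, role and database names as placeholders:

    # Dump one source database from the old instance (X)...
    pg_dump -Fc -h old-instance.xxxx.rds.amazonaws.com -U master_user -f app_db.dump app_db
    # ...and restore it into the consolidated instance; the target database
    # must already exist (or add --create and connect to postgres instead).
    pg_restore -h new-instance.xxxx.rds.amazonaws.com -U master_user -d app_db --no-owner app_db.dump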
0 votes • 0 answers

Not able to run pg_dump due to permissions, despite having root permissions

I am trying to run this command: sudo pg_dump -U bm_clients -Z 9 -v baydb | aws s3 cp - s3://thebay.com/bay.dump.gz The output is as follows: pg_dump: reading extensions pg_dump: identifying extension members pg_dump: reading schemas pg_dump: reading…
shubhendu • 101 • 1 • 3
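sudo only changes the operating-system user; what pg_dump may read is determined by the database role given to -U, so a likely cause (the error itself is cut off above) is that bm_clients lacks SELECT on some of the tables. A sketch of the two usual fixes, reusing the command from the question (the first line assumes local peer authentication for the postgres role):

    # Either run the dump as a database superuser role...
    sudo -u postgres pg_dump -Z 9 -v baydb | aws s3 cp - s3://thebay.com/bay.dump.gz

    # ...or, as a superuser in psql, give the existing role read access first:
    #   GRANT SELECT ON ALL TABLES IN SCHEMA public TO bm_clients;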
0 votes • 1 answer

How to run a bash backup script with simultaneous processes without getting stuck in an infinite loop?

I wrote this script to run PostgreSQL pg_dump for multiple schemas. Since it takes a long time to finish them all, I improved it by running simultaneous dumps with this: for SCHEMA in $SCH do while [ $(jobs | wc -l | xargs) -ge $PROC ]; do sleep 5;…
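One way a loop like this hangs is by counting every job the shell still lists instead of only the running ones; a sketch of a bounded-parallelism version, where SCH, PROC, DB and the output path are placeholders:

    #!/bin/bash
    PROC=4
    for SCHEMA in $SCH; do
        # jobs -rp prints only the PIDs of *running* background jobs, so a
        # slot frees up as soon as one dump finishes.
        while [ "$(jobs -rp | wc -l)" -ge "$PROC" ]; do
            sleep 5
        done
        pg_dump -n "$SCHEMA" -Fc -f "/backups/${SCHEMA}.dump" "$DB" &
    done
    wait    # block until the last dumps complete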
0 votes • 1 answer

Will the DB dump be consistent if a table is being inserted each second?

I want to create a consistent backup (dump) of a PostgreSQL database using sudo -u postgres pg_dump database_name > db_backup.sql. There are multiple tables that get a row inserted every second, so these tables are quite huge. I have…
tukusejssirs • 103 • 4