I have a Django project with 5 different PostgreSQL databases. The project was preemptively separated in terms of model routing, but has proven quite problematic so now I'm trying to reverse it. Unfortunately, there's some overlap of empty, migrated tables so pg_dump's out of the question. It looks like django-dumpdb may suit my needs but it doesn't handle per-database export/import. Additionally, Django's dumpdata/loaddata are installing 0 of the records from generated fixtures. Can I have some suggestions as to the least painful way to merge the data?
2 Answers
I had a similar issue, two identical websites sharing several models but with different objects. Unfortunately, merging them by natural_key was impossible.
I had to implement a new Django command, similar to loaddata, in order to append all objects from the second website's database into the first one.
You can find the code of the solution in this gist:
https://gist.github.com/MattFanto/f6c0ee0bc392da1d0d90f28efdb77e40
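The gist implements this as a full management command; the core append idea, stripped of the Django specifics, can be sketched like this (the function, the dict-based rows, and the field names are my own illustration, not code from the gist):

```python
def append_records(target, source, fk_fields=()):
    """Append `source` rows to `target`, reassigning primary keys.

    Rows are plain dicts with a "pk" key; `fk_fields` names keys that
    reference other source rows' old pks and must be remapped.
    Returns the mapping {old source pk: new pk in target}.
    """
    next_pk = max((row["pk"] for row in target), default=0) + 1
    pk_map = {}
    appended = []
    for row in source:
        new_row = dict(row)  # copy, so the source rows stay untouched
        pk_map[row["pk"]] = new_row["pk"] = next_pk
        next_pk += 1
        target.append(new_row)
        appended.append(new_row)
    # Second pass: remap foreign keys on the appended rows only, so
    # relations between source objects survive the renumbering.
    for row in appended:
        for fk in fk_fields:
            if row.get(fk) in pk_map:
                row[fk] = pk_map[row[fk]]
    return pk_map
```

This is the same renumber-then-remap pattern the command has to apply per model; in a real migration you would do it inside a transaction and fix up the database sequences afterwards.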

Mattia Fantoni
There's always Django's dumpdata, which is pretty easy to use.
Or you could do it manually:
- if the two databases share the same data (they mirror one another) and the same table structure, you can just run Django's syncdb to create the new table structure, then dump the old database and import it into the new one (I'm assuming you're using MySQL, but the general idea is the same)
- if the two databases hold different data (still with the same structure), you should import every single row from both databases: this way you keep the relations, but each row gets a new unique id in the single merged db
- if the two databases differ in both data and structure, you'll have to run two syncdbs and two imports, but that doesn't seem to be your case
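For the per-database export/import mentioned in the question, both dumpdata and loaddata accept a `--database` option pointing at an alias in your DATABASES setting. A sketch (the alias `legacy` is hypothetical, and this alone won't explain the 0-records symptom):

```shell
# Export from one configured database; --natural-foreign serializes
# foreign keys by natural key, which helps when pks overlap across dbs.
python manage.py dumpdata --database=legacy --natural-foreign --indent=2 > legacy.json

# Import into the default database (or pass --database=<alias> to target another).
python manage.py loaddata legacy.json
```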

Forhadul Islam
Unfortunately, when I did this, none of the fixtures were installed into the target database, and Django wasn't giving me any errors. – Joshua Steiner Apr 07 '17 at 18:02