
I have a Django (PostgreSQL) site in multiple environments: local, staging, and production.

I've tried using `manage.py dumpdata > db.json` to export a dump from staging, then loading the JSON file into the local site with `manage.py loaddata db.json`. It seems to work; however, the old blog posts that were already in the local site still persist after I load the data dump.
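
For context, this is roughly the sequence I'm running, extended with the `--natural-foreign`/`--natural-primary` and `--exclude` flags I've seen suggested for moving data between databases (those flags are an assumption on my part, not something I've confirmed solves the problem):

```bash
# On staging: dump with natural keys and skip contenttypes / permissions,
# which tend to collide with rows Django has already created on the target DB.
python manage.py dumpdata \
    --natural-foreign --natural-primary \
    --exclude contenttypes --exclude auth.permission \
    --indent 2 > db.json

# On local: load the fixture into the existing database.
python manage.py loaddata db.json
```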

  • Should I empty my tables before loading in the new data, and what are the best practices here? (A rough sketch of what I mean is below this list.)
  • Is there a Django package out there that handles this nicely?
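
One option I'm considering for the first point is wiping the local data before loading, roughly like this (a sketch only; I haven't verified that `flush` is the right level of reset for my case):

```bash
# Remove all rows but keep the schema and applied migrations intact,
# so stale local blog posts don't survive the import.
python manage.py flush --no-input

# Load the staging dump into the now-empty database.
python manage.py loaddata db.json
```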
PatrickC
  • Possible duplicate [here](https://stackoverflow.com/questions/31851273/how-to-migrate-an-existing-django-project-to-a-new-and-empty-database-or-schema): "If this is your first release then you can drop the whole database, remove all the migration files and then run `python manage.py makemigrations` for a clean database and migration theme. Otherwise, probably you have to edit migration files and remove the table trace from all." – felipe Nov 04 '19 at 21:12
  • Looking to have this as an ongoing workflow as the content of the site in the staging/local environment changes – PatrickC Nov 04 '19 at 21:24
  • Ah, understood. So fundamentally, are you trying to use one database for two Django projects? I'm a bit confused about the question. – felipe Nov 04 '19 at 22:23
  • Hey Felipe, there is a production and a staging version of the same site. I want to create blog posts in staging and push that to production when it's good to go. – PatrickC Nov 07 '19 at 22:16

0 Answers