I've been developing a Django app with South for a while, doing a sort of loose continuous deployment. Very shortly after my initial migration, I did a couple of data migrations that looked like this:
def forwards(self, orm):
    from django.core.management import call_command
    call_command("loaddata", "#######.json")
At the time, I didn't think anything of it. It had been easy enough to populate the database manually and then dump it all into a fixture. Then when I finally wrote some unit tests, I started getting errors like this:
Creating test database for alias 'default'...
Problem installing fixture '/home/axel/Workspace/02_ereader_blast/content/fixtures/99_deals.json': Traceback (most recent call last):
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/core/management/commands/loaddata.py", line 196, in handle
    obj.save(using=using)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/core/serializers/base.py", line 165, in save
    models.Model.save_base(self.object, using=using, raw=True)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/models/base.py", line 551, in save_base
    result = manager._insert([self], fields=fields, return_id=update_pk, using=using, raw=raw)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/models/manager.py", line 203, in _insert
    return insert_query(self.model, objs, fields, **kwargs)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/models/query.py", line 1593, in insert_query
    return query.get_compiler(using=using).execute_sql(return_id)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/models/sql/compiler.py", line 910, in execute_sql
    cursor.execute(sql, params)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/backends/postgresql_psycopg2/base.py", line 52, in execute
    return self.cursor.execute(query, args)
DatabaseError: Could not load content.BookDeal(pk=1): column "entry_id" of relation "content_bookdeal" does not exist
LINE 1: INSERT INTO "content_bookdeal" ("id", "book_id", "entry_id",...
                                                          ^
Installed 19 object(s) from 1 fixture(s)
Problem installing fixture '/home/axel/Workspace/02_ereader_blast/content/fixtures/99_deals_entries.json': Traceback (most recent call last):
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/core/management/commands/loaddata.py", line 190, in handle
    for obj in objects:
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/core/serializers/json.py", line 47, in Deserializer
    raise DeserializationError(e)
DeserializationError: Entry has no field named 'book_deals'
As far as I can tell, the loaddata command is using my most recent models rather than the frozen model state South recorded when the migration was written, and because I have changed the models significantly since then, they reject the old fixture data as invalid.
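I'm guessing the "right" way would have been to create the data through the frozen orm that South passes into the migration, instead of going through loaddata; something like the sketch below (the model and field names here are illustrative, not my real schema):

# Sketch of a data migration that builds rows from South's frozen ORM
# instead of calling loaddata, so the data always matches the schema as it
# was when the migration was written. Model/field names are placeholders.
from south.v2 import DataMigration

class Migration(DataMigration):

    def forwards(self, orm):
        Book = orm['content.Book']          # frozen model, not the current one
        BookDeal = orm['content.BookDeal']
        book = Book.objects.create(title="Example Book")
        BookDeal.objects.create(book=book)  # only fields that existed back then

    def backwards(self, orm):
        orm['content.BookDeal'].objects.all().delete()
        orm['content.Book'].objects.all().delete()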
So my questions are:
- What is the best way to set up future data migrations so that this doesn't happen?
- How can I backpedal out of this situation, and take it to best-practices land?