
I've been developing a django app with south for a while, and doing a sort of loose continuous deployment. Very shortly after my initial migration, I did a couple data migrations that looked like this:

def forwards(self, orm):
    from django.core.management import call_command
    call_command("loaddata", "#######.json")

At the time, I didn't think anything of it. It had been easy enough to populate the database manually and then dump it all into a fixture. Then when I finally wrote some unit tests, I started getting errors like this:

Creating test database for alias 'default'...
Problem installing fixture '/home/axel/Workspace/02_ereader_blast/content/fixtures/99_deals.json': Traceback (most recent call last):
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/core/management/commands/loaddata.py", line 196, in handle
    obj.save(using=using)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/core/serializers/base.py", line 165, in save
    models.Model.save_base(self.object, using=using, raw=True)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/models/base.py", line 551, in save_base
    result = manager._insert([self], fields=fields, return_id=update_pk, using=using, raw=raw)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/models/manager.py", line 203, in _insert
    return insert_query(self.model, objs, fields, **kwargs)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/models/query.py", line 1593, in insert_query
    return query.get_compiler(using=using).execute_sql(return_id)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/models/sql/compiler.py", line 910, in execute_sql
    cursor.execute(sql, params)
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/db/backends/postgresql_psycopg2/base.py", line 52, in execute
    return self.cursor.execute(query, args)
DatabaseError: Could not load content.BookDeal(pk=1): column "entry_id" of relation "content_bookdeal" does not exist
LINE 1: INSERT INTO "content_bookdeal" ("id", "book_id", "entry_id",...
                                                         ^


Installed 19 object(s) from 1 fixture(s)
Problem installing fixture '/home/axel/Workspace/02_ereader_blast/content/fixtures/99_deals_entries.json': Traceback (most recent call last):
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/core/management/commands/loaddata.py", line 190, in handle
    for obj in objects:
  File "/home/axel/Workspace/02_ereader_blast/venv/local/lib/python2.7/site-packages/django/core/serializers/json.py", line 47, in Deserializer
    raise DeserializationError(e)
DeserializationError: Entry has no field named 'book_deals'

As far as I can tell, the loaddata command uses my most recent models rather than the frozen model state South recorded when the migration was written. Because I have changed the models significantly since then, they now reject the old fixture data as invalid.
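A dependency-free sketch of the mismatch (field names come from the traceback above; the classes and field set are illustrative stand-ins, not real Django code):

```python
import json

# The fixture was dumped against an old schema and still carries an
# "entry" field; the current model no longer has it.
fixture = json.loads('[{"model": "content.bookdeal", "pk": 1,'
                     ' "fields": {"book": 3, "entry": 7}}]')

current_fields = {"book"}  # "entry" was removed by a later schema migration

errors = []
for obj in fixture:
    for name in obj["fields"]:
        if name not in current_fields:
            errors.append("%s has no field named %r" % (obj["model"], name))

print(errors)  # the same shape of error the deserializer raises
```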

So my questions are:

  • What is the best way to set up future data migrations so that this doesn't happen?
  • How can I backpedal out of this situation, and take it to best-practices land?

2 Answers

I found the solution, courtesy of this Stack Overflow question:

django loading data from fixture after backward migration / loaddata is using model schema not database schema

As noted in the top answer, I used a snippet that very elegantly patches where the loaddata command gets its model.

Note that I did have to broaden my freezes for those data migrations, so that they could access all the models they needed from the orm rather than directly.

This feels like the right way to solve the problem.
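For reference, the core of that snippet is a monkey-patch: temporarily swap out the deserializer's model-lookup function so it consults South's frozen ORM first and falls back to the live models. The real snippet patches `django.core.serializers.python._get_model`; everything below is a dependency-free stand-in sketching the same pattern:

```python
class serializers:
    """Stand-in for django.core.serializers.python."""
    @staticmethod
    def _get_model(identifier):
        return "live:" + identifier  # normally resolves the current app model

def loaddata_with_frozen_orm(frozen_orm, identifier):
    """Resolve models through the frozen ORM while loaddata runs."""
    original = serializers._get_model

    def _get_model(identifier):
        # Prefer the historical (frozen) model; fall back to the live one.
        if identifier in frozen_orm:
            return frozen_orm[identifier]
        return original(identifier)

    serializers._get_model = _get_model
    try:
        # in the real snippet: call_command("loaddata", "fixture.json")
        return serializers._get_model(identifier)
    finally:
        serializers._get_model = original  # always undo the patch

frozen = {"content.bookdeal": "frozen:content.bookdeal"}
result = loaddata_with_frozen_orm(frozen, "content.bookdeal")
```

The `try`/`finally` matters: the patch must be undone even if loading fails, or later fixtures would silently resolve against the wrong models.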


My approach is highly likely to be a complete hack / abuse of south / worst kind of practice, I think. However, if you know that your django models conform to your data tables, I might go for a fresh-start approach:

  1. rename (or delete) the relevant migrations folders.
  2. in the south_migrationhistory data table, remove all the associated entries for the apps you are trying to create a fresh start for.
  3. python manage.py convert_to_south app-name.
  4. everything should be good to go.
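The steps above might look like the following for an app named "content" (the app name, and a working `dbshell`, are assumptions; run this against a backup first, since it rewrites migration history):

```shell
# 1. set aside (or delete) the old migrations on disk
mv content/migrations content/migrations.bak

# 2. forget the app's history in South's bookkeeping table
echo "DELETE FROM south_migrationhistory WHERE app_name = 'content';" \
    | python manage.py dbshell

# 3. re-freeze the current schema as a fresh initial migration
python manage.py convert_to_south content

# 4. confirm everything is in sync
python manage.py migrate content
```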

If the data tables are canonical and your django models are out of line, I conform my models to the tables by running:

python manage.py inspectdb > inspectdb.py

Now I can compare the two versions of the django code for my models to get them in alignment. This enables me to go through the fresh start sequence.

  • The problem is that my fixtures fail in a fresh start approach, because they conform to a data model that is no longer valid. All of the schema migrations work fine, and if I am able to load those fixtures using the canonical schema, then future data migrations will get them up to date. – Axel Magnuson Feb 22 '13 at 21:13
  • @Axel Some of my data tables are created elsewhere, and then I create a model (often from the inspected.py file I create), so I don't use fixtures to load data. Here is something from django on fixtures and importing data: https://docs.djangoproject.com/en/dev/howto/initial-data/. You are probably much deeper down this hole than I'm likely to get. – Cole Feb 22 '13 at 21:46