
I have a Django app that I need to perform a migration on. Here's a representational schema of what I need to modify:

    from django.db import models

    class A(models.Model):
        c = models.ForeignKey('C')
        ...

    class B(models.Model):
        c = models.ForeignKey('C')
        ...

    class C(models.Model):
        x = models.CharField(max_length=...)
        y = models.CharField(max_length=...)
        z = models.CharField(max_length=...)

        class Meta:
            unique_together = (('x', 'y', 'z'),)

z is no longer relevant to the uniqueness of C, so I am going to drop the field z and narrow the unique constraint to just x and y.

But first I need a data migration that removes the soon-to-be-duplicate C rows, and I need to repoint all of the ForeignKeys at the single consolidated row for each group.

I'm looking for a better way than what I'll outline in my answer below. I'm not entirely satisfied with it because it has no safeguard against deleting a C row that is still in use, other than the algorithm itself. (For example, what if there is another foreign key relationship to C that I overlooked?)

(Django 1.7 & Postgres)
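Before writing the migration, it can help to confirm which (x, y) groups will actually collide once z is dropped. A minimal, ORM-free sketch of that check (in the real app this would be something like `C.objects.values('x', 'y').annotate(Count('id')).filter(id__count__gt=1)`; the function and sample data here are illustrative):

```python
from collections import Counter

def duplicate_groups(rows):
    """Given (x, y, z) tuples, return the (x, y) keys that will collide
    once z is dropped from the uniqueness constraint."""
    counts = Counter((x, y) for x, y, _z in rows)
    return [key for key, n in counts.items() if n > 1]

rows = [("a", "b", "1"), ("a", "b", "2"), ("c", "d", "1")]
print(duplicate_groups(rows))  # [('a', 'b')]
```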

rrauenza

1 Answer


Here are the steps that seem most straightforward:

  • Make a migration that drops z and the unique_together constraint.
  • Make a data migration:
    • For each A object, reassign A.c to the C row with the smallest id among those sharing A.c's (x, y) values.
    • For each B object, reassign B.c the same way.
    • Collect the set of all C ids and the set of C ids still referenced by A and B, subtract the two sets, and delete() the remainder.
  • Make a migration that reinstates unique_together on ('x', 'y').
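The reassignment and cleanup steps above can be sketched ORM-free as follows. In the real data migration this logic would live inside a RunPython operation, fetching the historical A, B, and C models via apps.get_model() and calling save() on each repointed object; the function and names here are illustrative:

```python
from collections import defaultdict

def consolidate(c_rows, fk_holders):
    """Repoint each holder's .c at the lowest-id C in its (x, y) group,
    then return the now-unreferenced C rows, which are safe to delete."""
    groups = defaultdict(list)
    for c in c_rows:
        groups[(c.x, c.y)].append(c)
    # Keep the smallest id in each (x, y) group as the canonical row.
    canonical = {key: min(rows, key=lambda c: c.id) for key, rows in groups.items()}
    for holder in fk_holders:  # would iterate A.objects.all(), then B.objects.all()
        holder.c = canonical[(holder.c.x, holder.c.y)]
        # holder.save() in the real migration
    kept = {id(c) for c in canonical.values()}
    return [c for c in c_rows if id(c) not in kept]
```

Running the orphan deletion as "all C minus referenced C" (rather than deleting inside the loop) is what gives the small safety margin: any C row still referenced by a relation the loop covered will never appear in the returned list.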