I have a Django app on which I need to perform a migration. Here's a representative schema of what I need to modify:
class A(Model):
    c = ForeignKey(C)
    ...

class B(Model):
    c = ForeignKey(C)
    ...

class C(Model):
    x = CharField()
    y = CharField()
    z = CharField()

    class Meta:
        unique_together = (('x', 'y', 'z'),)
The field z is no longer relevant to the uniqueness of C, so I am going to drop z and change the unique requirement to just x and y.
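The eventual schema migration might look roughly like this (a sketch only: the app label, file name, and dependency are placeholders, assuming Django 1.7's built-in migration operations):

```python
# migrations/00XX_drop_z.py  -- hypothetical file/app names
from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        # placeholder: the data migration that consolidates duplicates
        ('myapp', '00XX_consolidate_duplicates'),
    ]

    operations = [
        # Relax the constraint first, then drop the column.
        migrations.AlterUniqueTogether(
            name='c',
            unique_together={('x', 'y')},
        ),
        migrations.RemoveField(
            model_name='c',
            name='z',
        ),
    ]
```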
But first I need to do a data migration that removes the soon-to-be duplicate entries, and I need to fix up all of the ForeignKeys to point to the single consolidated entry.
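Stripped of Django specifics, that consolidation step amounts to: group C rows by (x, y), elect one canonical row per group, remap every ForeignKey value to it, and keep only the canonical rows. A minimal sketch in plain Python (rows modeled as dicts; all names here are illustrative, not from the original code):

```python
from collections import defaultdict


def consolidate(c_rows, fk_values):
    """Group C rows by (x, y), keep the lowest-id row of each group,
    and rewrite a list of FK values (ids of C) accordingly.

    c_rows: list of dicts with keys 'id', 'x', 'y'
    fk_values: list of C ids referenced from A/B rows
    Returns (surviving_rows, remapped_fk_values).
    """
    groups = defaultdict(list)
    for row in c_rows:
        groups[(row['x'], row['y'])].append(row['id'])

    # Canonical id per duplicate group: keep the lowest id.
    remap = {}
    for ids in groups.values():
        keep = min(ids)
        for cid in ids:
            remap[cid] = keep

    survivors = [r for r in c_rows if remap[r['id']] == r['id']]
    return survivors, [remap[v] for v in fk_values]


rows = [
    {'id': 1, 'x': 'a', 'y': 'b'},
    {'id': 2, 'x': 'a', 'y': 'b'},  # duplicate of id 1 once z is ignored
    {'id': 3, 'x': 'a', 'y': 'c'},
]
survivors, fks = consolidate(rows, [2, 3, 1])
# survivors keep ids 1 and 3; the reference to 2 is remapped to 1
```

In the real migration the groups would come from a values()/annotate() query on C, and the remap would be applied with queryset update() calls on A and B before deleting the losers.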
I'm asking whether there is a better way than the approach I'll outline in my answer below. I'm not entirely satisfied with it, because it has no safeguard against dropping a row that is still referenced by a foreign key, other than the algorithm itself. (What if, say, there is another ForeignKey relationship to C that I overlooked?)
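One possible safeguard, sketched here in plain Python rather than as Django API: just before deleting the duplicate rows, assert that none of the doomed ids are still referenced by any relation. The sets of referenced ids would come from querying every model with a ForeignKey to C (including any you might have overlooked); the function and its names are hypothetical:

```python
def assert_unreferenced(ids_to_delete, referencing_id_sets):
    """Raise if any id scheduled for deletion is still referenced.

    ids_to_delete: ids of duplicate C rows about to be dropped
    referencing_id_sets: one set of currently-referenced C ids per
        FK relationship (A.c, B.c, and anything else pointing at C)
    """
    doomed = set(ids_to_delete)
    for refs in referencing_id_sets:
        leftovers = doomed & set(refs)
        if leftovers:
            raise RuntimeError(
                'about to delete C rows still in use: %r'
                % sorted(leftovers))
```

If the remapping worked, the check passes; if some relation was missed, it fails loudly instead of letting the database raise (or, worse, cascade-delete) during the drop.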
(Django 1.7 & Postgres)