I am running into serious time and computing-power problems when doing big data migrations (several hundred thousand rows). I am developing a service that handles a lot of data in Rails. Our models are constantly changing as we get more and more clever about our design, which leads to a lot of migrations on our database, a Postgres 9.0 database. Often these migrations also include some kind of migration of the data itself. Yesterday we found out that we needed to move a text attribute on a model into a separate model, so that the attribute is no longer just a column on the model but a one-to-many relationship instead.
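For reference, the relevant models after the change look roughly like this (the class and association names are my assumptions, inferred from the migration below):

class Car < ActiveRecord::Base
  # replaces the old :description column with a one-to-many relationship
  has_many :descriptions, :class_name => 'CarDescription'
end

class CarDescription < ActiveRecord::Base
  belongs_to :car
end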
My migration looked somewhat like this:
def self.up
  create_table :car_descriptions do |t|
    t.integer :car_id
    t.text :description
    t.timestamps
  end

  Car.find_each do |car|
    if car.description.present?
      car.descriptions.build :description => car.description
    end
    car.save
  end

  remove_column :cars, :description
end
Now the problem is that this runs pretty slowly, and even worse, if I set a counter and print out the progress, I can see that the migration gets slower and slower over time. In my activity monitor I can see that the Ruby process is taking up more and more memory.
So my question is: is there a better way to do big data migrations like this?
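For what it's worth, one alternative I have been wondering about is pushing the data copy down into SQL, so that no Car objects get instantiated at all. This is only an untested sketch of what I have in mind, using execute in the same migration:

def self.up
  create_table :car_descriptions do |t|
    t.integer :car_id
    t.text :description
    t.timestamps
  end

  # Untested sketch: let Postgres copy the rows in one set-based statement
  # instead of loading and saving every Car through ActiveRecord.
  execute <<-SQL
    INSERT INTO car_descriptions (car_id, description, created_at, updated_at)
    SELECT id, description, NOW(), NOW()
    FROM cars
    WHERE description IS NOT NULL AND description <> ''
  SQL

  remove_column :cars, :description
end

Would something like this be the right direction, or is there a better pattern for large data migrations in Rails?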