WARNING: Changing the primary key type on a large table is slow. Plan for it if you can.
This question wasn't particularly popular, drawing a "just do the work" comment and votes to close. I'll share what I learned anyway; hopefully it's helpful to someone.
OK, let's "just do the work":
Hints about which primary keys are integers and which are bigints are in schema.rb.
- Identify integer primary keys
#schema.rb
create_table "users", id: :serial, force: :cascade do |t|
The hint for an integer primary key is id: :serial; see this SO answer: https://stackoverflow.com/a/54683428/252799
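If you'd rather ask the database than eyeball schema.rb, here is a minimal sketch (assuming PostgreSQL) that lists every table whose id column is still a plain integer:
# Rails console (PostgreSQL assumed): tables whose "id" column is still a 4-byte integer
ActiveRecord::Base.connection.execute(<<~SQL).to_a
  SELECT table_name, data_type
  FROM information_schema.columns
  WHERE column_name = 'id' AND data_type = 'integer'
SQL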
- How large are your primary keys?
#Rails console
ActiveRecord::Base.connection.execute('SELECT MAX(id) FROM comments').first
{"max"=>91718106}
The max allowed value is roughly 2 billion, 2,147,483,647 to be exact (the ceiling of a 4-byte signed integer).
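For a rough headroom check across several tables, a sketch like this works from the console (the table names here are examples, substitute your own):
# Rails console: how close each table's max id is to the 32-bit limit (example table names)
INT4_MAX = 2_147_483_647
%w[users comments posts].each do |table|
  max = ActiveRecord::Base.connection.execute("SELECT MAX(id) FROM #{table}").first["max"].to_i
  puts format("%-15s %12d (%.1f%% of limit)", table, max, 100.0 * max / INT4_MAX)
end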
- Create a migration
rails g migration change_primary_key_to_bigint
- Add the changes to primary keys
class ChangePrimaryKeyToBigint < ActiveRecord::Migration[6.0]
  def up
    change_column :users, :id, :bigint
  end
end
- Find all references for each primary key you change
change_column :users, :id, :bigint
change_column :user_companies, :user_id, :bigint
change_column :posts, :author_id, :bigint
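To keep the migration reversible, you can add a matching down that switches the same columns back (a sketch; it only works while every id still fits in 32 bits):
# Inside the same migration class
def down
  change_column :users, :id, :integer
  change_column :user_companies, :user_id, :integer
  change_column :posts, :author_id, :integer
end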
Be thorough!
Search your schema.rb for every _id column and carefully assess each one.
You've probably got several references, join tables, custom reference names, or even polymorphic associations that you might otherwise overlook.
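As a cross-check against the live database, a minimal sketch (again assuming PostgreSQL) that lists every integer column ending in _id:
# Rails console (PostgreSQL assumed): integer columns that look like foreign keys
ActiveRecord::Base.connection.execute(<<~SQL).to_a
  SELECT table_name, column_name
  FROM information_schema.columns
  WHERE column_name ~ '_id$' AND data_type = 'integer'
SQL
Custom reference names that don't end in _id won't show up here, so a manual pass is still needed.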
- Run it
And wait. For hours, depending on the size of the tables.
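A sketch of kicking it off; because the migration can run for hours, start it from something that survives a dropped connection (tmux, screen, or your deploy tooling):
# Shell: run the pending migration in production
RAILS_ENV=production bin/rails db:migrate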
If zero downtime is important to you, this post looks promising: https://buildkite.com/blog/avoiding-integer-overflows-with-zero-downtime