
I have a database with existing tables that are not used by my Python code. I generated a migration using Flask-Migrate and ran it, and it deleted my existing tables while creating the user table. How can I run migrations without removing any existing tables?

I read the answer to the question "Preserve existing tables in database when running Flask-Migrate", but it doesn't work for me because I do not own the database, and I do not know which tables might exist at the time of deployment... Which means I cannot whitelist the tables that should be preserved.

Is there a way to tell Flask-migrate/Alembic not to drop any tables that it doesn't know about?

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_script import Manager
from flask_migrate import Migrate, MigrateCommand

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'my_data'

db = SQLAlchemy(app)
migrate = Migrate(app, db)

manager = Manager(app)
manager.add_command('db', MigrateCommand)

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(128))

if __name__ == '__main__':
    manager.run()
Lindsey1986
  • If you have access to delete them, surely you can just query the db for existing tables and whitelist them. – noslenkwah Aug 23 '19 at 18:52
  • I don't have access to delete them. Just have access to a (somewhat similar) copy of a production database that I can use to generate the migrations... Anyway, how can I tell Alembic *in an automated fashion* which tables to ignore? – Lindsey1986 Aug 23 '19 at 20:10

3 Answers


You just need this:

# in env.py

def include_object(object, name, type_, reflected, compare_to):
    if type_ == "table" and reflected and compare_to is None:
        return False
    else:
        return True


context.configure(
    # ...
    include_object=include_object,
)

See the documentation here: https://alembic.sqlalchemy.org/en/latest/cookbook.html#don-t-generate-any-drop-table-directives-with-autogenerate
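Since the hook is just a plain function, you can sanity-check its behavior outside Alembic. A table that was reflected from the database but has no model counterpart (`compare_to is None`) is filtered out, so autogenerate never emits a drop for it; the table names below are only illustrative:

```python
def include_object(object, name, type_, reflected, compare_to):
    # A table reflected from the database (reflected=True) that has no
    # counterpart in your model metadata (compare_to is None) would
    # otherwise produce a DROP TABLE directive; skip it.
    if type_ == "table" and reflected and compare_to is None:
        return False
    return True

# Reflected-only table: excluded, so autogenerate won't drop it
print(include_object(None, "legacy_table", "table", True, None))   # False

# Table backed by one of your models: compared as usual
print(include_object(None, "user", "table", False, object()))      # True
```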

Jan G.
  • I tried this, and still get errors, do you have an example of this working? Where does it need to go in the env.py file? – BLang Jul 01 '21 at 15:54

You can use a Rewriter to do an automatic check before deletion, by overriding the ops.DropTableOp operation.

If you want, you can also add a provision to only drop tables that you do have control over. These will be the ones that inherit from your Base (in the case of pure Alembic) or db.Model (for Flask).

Example:

from alembic.autogenerate import rewriter
from alembic.operations import ops

writer = rewriter.Rewriter()

@writer.rewrites(ops.DropTableOp)
def drop_table(context, revision, op):
    # Base here is your declarative base (use db.Model.metadata for Flask)
    if op.table_name in Base.metadata.tables.keys():
        return op  # only return the operation when you want the drop kept
    return []  # we need to return an iterable

Note that you need to pass the writer object to the process_revision_directives kwarg when calling context.configure in your env.py file (see the docs).
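For example, assuming a standard Alembic/Flask-Migrate env.py (the `connection` and `target_metadata` names below come from that template), the wiring might look like this; a Rewriter instance is itself a valid process_revision_directives callable:

```python
# in env.py, inside run_migrations_online()
context.configure(
    connection=connection,
    target_metadata=target_metadata,
    # the Rewriter filters the autogenerated operations before
    # they are written to the migration script
    process_revision_directives=writer,
)
```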

moshevi

In the migrations/env.py file, add this function:

def include_object(object, name, type_, reflected, compare_to):
    if type_ == "table" and reflected and compare_to is None:
        return False
    else:
        return True

and modify the run_migrations_online function:

def run_migrations_online():
    ...
    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            process_revision_directives=process_revision_directives,
            # new line
            include_object=include_object,
            **current_app.extensions['migrate'].configure_args)
    ...
max