
I'm currently working on a Python backend using Flask, SQLAlchemy and Flask-Migrate/Alembic. When I run flask db upgrade head (after performing init and migrate), I get the following error for the 'spatial_ref_sys' table of PostGIS:

sqlalchemy.exc.InternalError: (psycopg2.errors.DependentObjectsStillExist) cannot drop table spatial_ref_sys because extension postgis requires it
HINT: You can drop extension postgis instead.

To exclude the spatial_ref_sys table, I modified the env.py file as follows:

def include_object(object, name, type_, reflected, compare_to):    
    if type_ == "table" and name == 'spatial_ref_sys':
        return False
    else:
        return True


def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well.  By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url, target_metadata=get_metadata(), literal_binds=True, include_object=include_object
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """

    # this callback is used to prevent an auto-migration from being generated
    # when there are no changes to the schema
    # reference: http://alembic.zzzcomputing.com/en/latest/cookbook.html
    def process_revision_directives(context, revision, directives):
        if getattr(config.cmd_opts, 'autogenerate', False):
            script = directives[0]
            if script.upgrade_ops.is_empty():
                directives[:] = []
                logger.info('No changes in schema detected.')

    connectable = get_engine()

    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=get_metadata(),
            process_revision_directives=process_revision_directives,
            **current_app.extensions['migrate'].configure_args,
            include_object=include_object
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()

Unfortunately, the modifications do not have any effect. In fact, it seems as if the added include_object function is not executed at all. I compared my code to the documentation (https://alembic.sqlalchemy.org/en/latest/autogenerate.html) but could not identify any mistake. Am I missing anything?

Scopuli

1 Answer


The include_object option that you added is likely correct, but keep in mind that this option is used when the migration script is generated, not when the database is upgraded.
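As a quick standalone sketch (nothing Alembic requires you to run, and the users table name is made up for the example): the hook is an ordinary callable that autogenerate invokes for each schema object while composing a new revision, so flask db migrate consults it, while flask db upgrade never does.

def include_object(object, name, type_, reflected, compare_to):
    # Returning False keeps an object out of the autogenerate comparison.
    if type_ == "table" and name == "spatial_ref_sys":
        return False
    return True

# What autogenerate effectively asks for each table it encounters:
print(include_object(None, "spatial_ref_sys", "table", True, None))  # False -> excluded
print(include_object(None, "users", "table", True, None))            # True  -> compared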

To fix the migration that has already been generated, you should hand-edit it and delete the operation that upsets PostGIS. Going forward, the include_object filter will prevent new migration scripts from attempting to drop this table.
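For illustration only (the exact contents of your script under migrations/versions/ will differ), the generated revision probably contains something like the following, and the spatial_ref_sys operations are the lines to delete:

from alembic import op


def upgrade():
    # ... other autogenerated operations ...
    op.drop_table('spatial_ref_sys')  # <-- delete this line by hand


def downgrade():
    # ... other autogenerated operations ...
    # Also delete the matching op.create_table('spatial_ref_sys', ...) call here.
    pass

After removing the drop, flask db upgrade should complete without hitting the DependentObjectsStillExist error.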

Miguel Grinberg