
I'm at my wits' end here. I'm trying to create an initial migration for my Postgres database using Alembic, autogenerating it from my SQLAlchemy ORM models.

Here's my models.py file:

from typing import List, Optional

from sqlalchemy import Boolean, ForeignKey, String
from sqlalchemy.orm import Mapped, mapped_column, relationship

from myapp.database.core import Base, TimestampMixin

from .choices import RoleChoices


class Company(TimestampMixin, Base):
    __tablename__ = "companies"

    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str]
    domain: Mapped[str] = mapped_column(String(128), unique=True)
    is_active: Mapped[bool] = mapped_column(Boolean, default=True)
    users: Mapped[List["User"]] = relationship(
        back_populates="company", cascade="all, delete-orphan", passive_deletes=True
    )

    def __repr__(self) -> str:
        return f"""
        <Company(
            id={self.id},
            name={self.name},
            domain={self.domain},
            is_active={self.is_active})>"""


class User(TimestampMixin, Base):
    __tablename__ = "users"

    id: Mapped[int] = mapped_column(primary_key=True)
    email: Mapped[str] = mapped_column(String(128), unique=True)
    username: Mapped[Optional[str]] = mapped_column(String(128))
    is_active: Mapped[bool] = mapped_column(Boolean, default=True)
    role: Mapped[RoleChoices]
    company_id: Mapped[int] = mapped_column(
        ForeignKey("companies.id", ondelete="CASCADE")
    )
    company: Mapped["Company"] = relationship(back_populates="users")

    def __repr__(self) -> str:
        return f"<User(id={self.id}, email={self.email}, username={self.username})>"

My base model is:

from datetime import datetime

from sqlalchemy import DateTime
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class TimestampMixin:
    created_at: Mapped[datetime] = mapped_column(
        DateTime, nullable=False, default=datetime.utcnow
    )
    updated_at: Mapped[datetime] = mapped_column(DateTime, nullable=True)

My alembic.ini file is the one autogenerated by running alembic init migrations, so I haven't edited it at all.

My env.py file inside the migrations folder looks like this:

import os
from logging.config import fileConfig

from alembic import context
from dotenv import find_dotenv, load_dotenv
from sqlalchemy import engine_from_config, pool

from myapp.auth.models import Company, User  # noqa
from myapp.database.core import Base

load_dotenv(find_dotenv())

DB_USER = os.environ.get("DB_USER")
DB_NAME = os.environ.get("DB_NAME")
DB_PASSWORD = os.environ.get("DB_PASSWORD")
DB_HOST = os.environ.get("DB_HOST")
DB_PORT = os.environ.get("DB_PORT")

connection_string = (
    f"postgresql+psycopg2://{DB_USER}:{DB_PASSWORD}@{DB_HOST}:{DB_PORT}/{DB_NAME}"
)

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
config.set_main_option("sqlalchemy.url", connection_string)

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well.  By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
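One caveat with building the connection string as an f-string in env.py: credentials containing characters like @ or : produce a malformed URL. A sketch (with assumed example values) using SQLAlchemy's URL.create, which escapes them:

```python
from sqlalchemy.engine import URL

# URL.create percent-escapes the credentials, unlike an f-string;
# the password here is a hypothetical example.
url = URL.create(
    drivername="postgresql+psycopg2",
    username="postgres",
    password="p@ss:word",  # '@' and ':' would corrupt an f-string URL
    host="localhost",
    port=5432,
    database="postgres",
)
print(url.render_as_string(hide_password=False))
# -> postgresql+psycopg2://postgres:p%40ss%3Aword@localhost:5432/postgres
```

The resulting URL object can be passed to config.set_main_option after rendering it as a string, exactly like the f-string version.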

Finally, I set up my database via Docker; the commands are in my Makefile:

.PHONY: db-create db-start db-stop db-remove db-drop-extensions db-init

db-create:
    @echo "Creating database container..."
    docker run --name $(DB_CONTAINER_NAME) -e POSTGRES_PASSWORD=$(DB_PASSWORD) -d postgres

db-start:
    @echo "Starting database container..."
    docker start $(DB_CONTAINER_NAME)

db-stop:
    @echo "Stopping database container..."
    docker stop $(DB_CONTAINER_NAME)

db-remove:
    @echo "Removing database container..."
    docker rm $(DB_CONTAINER_NAME)

db-init: db-create db-start
    @echo "Initialising database..."
    sleep 10

db-teardown: db-stop db-remove

Oh, and for the purpose of testing, my .env file looks like this:

export DB_NAME="postgres"
export DB_USER="postgres"
export DB_PASSWORD="postgres"
export DB_HOST="localhost"
export DB_PORT="5432"
export DB_CONTAINER_NAME="postgres"

After running source .env, I execute make db-init, after which this gets printed to my terminal:

Creating database container...
docker run --name postgres -e POSTGRES_PASSWORD=postgres -d postgres
f32873ec9faf9c09bd0ea1e5ff1247dadd60c3eb0ed82b63a318813b27ca783f
Starting database container...
docker start postgres
postgres
Initialising database...
sleep 10

Now I have my database container up and running.

Finally, when I try to autogenerate a revision with alembic revision --autogenerate -m "add company and user models", I get the following error:

INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
Traceback (most recent call last):
  File "/Users/myuser/my-app/.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1965, in _exec_single_context
    self.dialect.do_execute(
  File "/Users/myuser/my-app/.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py", line 921, in do_execute
    cursor.execute(statement, parameters)
psycopg2.errors.InvalidSchemaName: no schema has been selected to create in
LINE 2: CREATE TABLE alembic_version (
...
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.InvalidSchemaName) no schema has been selected to create in
LINE 2: CREATE TABLE alembic_version (
                     ^

[SQL: 
CREATE TABLE alembic_version (
        version_num VARCHAR(32) NOT NULL, 
        CONSTRAINT alembic_version_pkc PRIMARY KEY (version_num)
)

]

What's going on here? It seems like Alembic doesn't have access to the right schema. On the other hand, I haven't explicitly set a schema, so I'm under the impression that Alembic should default to the public schema in Postgres.
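For reference, a quick diagnostic that shows which server actually answers on localhost:5432 and what schema CREATE TABLE would target (a sketch, assuming the credentials from .env):

```shell
# Ask whichever Postgres answers on localhost:5432 to identify itself
# and report the schema that unqualified CREATE TABLE resolves to.
psql "postgresql://postgres:postgres@localhost:5432/postgres" \
  -c "SELECT version();" \
  -c "SHOW search_path;" \
  -c "SELECT current_schema();"
```

If current_schema() comes back NULL, nothing on the search_path is a schema you can create in, which is exactly the "no schema has been selected to create in" condition.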

P4nd4b0b3r1n0

1 Answer


I feel kind of dumb right now. The issue was that I had the Postgres.app application open on my Mac at the same time as my Postgres database in Docker. Alembic was connecting to that instance of Postgres on localhost:5432 rather than to the container.

After I closed the app, the migrations worked fine.
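To catch this kind of clash up front, you can check which process actually owns port 5432. (Note that the docker run in the Makefile never publishes a port with -p 5432:5432, so localhost:5432 could only ever have been another server.)

```shell
# List the process listening on TCP 5432; with Postgres.app open this
# shows its postgres process rather than Docker's port proxy.
lsof -nP -iTCP:5432 -sTCP:LISTEN
```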

P4nd4b0b3r1n0