How can I use Alembic's --autogenerate to migrate multiple Postgres schemas that are not hard-coded in the SQLAlchemy model?
This is a pretty old question, but for anyone who runs into the same problem, I solved it by using the following in my env.py:
from alembic import context
from sqlalchemy import engine_from_config, pool

# config, target_metadata and build are defined earlier in env.py


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.
    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix='sqlalchemy.',
        poolclass=pool.NullPool)

    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=target_metadata,
            include_schemas=True,
            version_table_schema=build  # <-- This is the relevant line
        )

        with context.begin_transaction():
            context.run_migrations()
where build is a string holding the desired schema name. My use case is slightly different (multiple distributed builds with a single database containing multiple identical schemas), but I was running into the same problem of Alembic not correctly detecting the schema I was trying to connect to.
I use environment variables to determine the correct build, which works quite well with Zappa.
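For reference, here is a minimal sketch of how build could be read from an environment variable near the top of env.py. The variable name TARGET_SCHEMA and the 'public' fallback are my own assumptions, not part of the original setup; adjust them to match your deployment (for Zappa, you would set the variable in the stage's environment settings).

import os

# Hypothetical environment variable holding the schema name for this build;
# falls back to 'public' if nothing is configured.
build = os.environ.get('TARGET_SCHEMA', 'public')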