Alembic support for multiple Postgres schemas

我寻月下人不归 2020-12-15 06:25

How can I use Alembic's --autogenerate to migrate multiple Postgres schemas that are not hard-coded in the SQLAlchemy model? (mirror question of SQLAlchemy support of Postgres schemas)

3 Answers
  •  [愿得一人]
    2020-12-15 07:23

    This is a pretty old question, but for anyone who ran into the same problem, I solved it by using the following in my env.py:

    from alembic import context
    from sqlalchemy import engine_from_config, pool

    # config, target_metadata and build are defined earlier in env.py;
    # build is the target schema name (see below).

    def run_migrations_online():
        """Run migrations in 'online' mode.

        In this scenario we need to create an Engine
        and associate a connection with the context.
        """
        connectable = engine_from_config(
            config.get_section(config.config_ini_section),
            prefix='sqlalchemy.',
            poolclass=pool.NullPool)

        with connectable.connect() as connection:
            context.configure(
                connection=connection,
                target_metadata=target_metadata,
                include_schemas=True,
                version_table_schema=build  # <-- This is the relevant line
            )

            with context.begin_transaction():
                context.run_migrations()


    where build is a string that defines the desired schema name. My use case is slightly different (multiple distributed builds with a single database containing multiple identical schemas), but I was running into the same problem of Alembic not correctly detecting the schema I was attempting to connect to.

    I use environment variables to determine the correct build, since that approach works well with Zappa.
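
    As a minimal sketch of that idea (the BUILD_SCHEMA variable name is just an illustration, not part of the original setup), the schema name could be picked up near the top of env.py like this:

    import os

    # Hypothetical environment variable holding the schema name for this build;
    # fall back to 'public' if it is not set.
    build = os.environ.get("BUILD_SCHEMA", "public")

    Passing that value as version_table_schema keeps each build's alembic_version table inside its own schema, so parallel deployments do not share migration state.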
