I'm running a Ruby on Rails application in a Docker container. I want to create and then restore a database dump in the Postgres container, but I'm not sure how to do it.
Below is what I've done.
You'll need to mount the dump into the container so you can access it. Something like this in `docker-compose.yml`:

```yaml
db:
  volumes:
    - './db_dump:/db_dump'
```
Make a local directory named `db_dump` and place your `db_dump.gz` file there.
Use `POSTGRES_DB` in the environment (as you mentioned in your question) to automatically create the database. Start `db` by itself, without the Rails server:

```sh
docker-compose up -d db
```
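For reference, a `db` service along these lines would cover both the mount and the automatic database creation. This is only a sketch; the image tag, database name, and password are placeholders, so use whatever your app actually expects:

```yaml
db:
  image: postgres:13              # assumption: use the tag you actually run
  environment:
    POSTGRES_DB: dbname           # created automatically on first start
    POSTGRES_PASSWORD: postgres   # assumption: set real credentials
  volumes:
    - './db_dump:/db_dump'
```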
Wait a few seconds for the database to be available, then import your data. Note that `gunzip` unpacks the file in place, so the later commands refer to the unpacked `db_dump` file, not the `.gz`:

```sh
docker-compose exec db gunzip /db_dump/db_dump.gz
docker-compose exec db psql -U postgres -d dbname -f /db_dump/db_dump
docker-compose exec db rm -f /db_dump/db_dump
```
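If you'd rather not unpack the dump inside the container at all, you can stream it straight into `psql` instead. A rough equivalent of the three commands above, using the same placeholder names:

```sh
docker-compose exec db sh -c 'gunzip -c /db_dump/db_dump.gz | psql -U postgres -d dbname'
```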
You can also make a script that does this import, bake it into your image, and call it with a single docker-compose command. Or have your entrypoint script check whether a dump file is present and, if so, unzip and import it... whatever you need to do.
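As a sketch of that idea, something like the following could be added to the db image. The script name and location are hypothetical, and it assumes the mount and placeholder names used above:

```sh
#!/bin/sh
# import_dump.sh -- hypothetical helper: load /db_dump/db_dump.gz if present.
set -e

DUMP=/db_dump/db_dump.gz

if [ -f "$DUMP" ]; then
  gunzip "$DUMP"                                   # leaves /db_dump/db_dump behind
  psql -U postgres -d dbname -f /db_dump/db_dump   # dbname is a placeholder
  rm -f /db_dump/db_dump
fi
```

You could then run it with `docker-compose exec db /import_dump.sh` once the database is up.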
Once the import is done, start the rest of the stack:

```sh
docker-compose up -d web
```
If you are doing this by hand to prep a new setup, then you're done. If you need to automate it as part of a toolchain, put these steps in a script: start the containers separately, do the db import in between, and use `sleep` to cover any startup delays.
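A rough end-to-end sketch of that script, using the same service and placeholder names as above. It polls `pg_isready` instead of a fixed `sleep`, but either works:

```sh
#!/bin/sh
# provision.sh -- hypothetical one-shot setup: start db, import the dump, start web.
set -e

docker-compose up -d db

# Poll until Postgres accepts connections (a plain `sleep 10` would also do).
until docker-compose exec -T db pg_isready -U postgres >/dev/null 2>&1; do
  sleep 1
done

docker-compose exec -T db gunzip /db_dump/db_dump.gz
docker-compose exec -T db psql -U postgres -d dbname -f /db_dump/db_dump
docker-compose exec -T db rm -f /db_dump/db_dump

docker-compose up -d web
```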