Huge files in Docker containers

Submitted by 落爺英雄遲暮 on 2019-12-03 06:47:12

Am I supposed to include them in the container (such as COPY large_folder large_folder_in_container)?

If you do so, that would include them in the image, not the container: you could launch 20 containers from that image, and the actual disk space used would still be 10 GB, because every container shares the image's read-only layers and only adds a thin writable layer on top.

If you were to build another image from your first image, the layered filesystem would reuse the layers from the parent image, and the new image would still be "only" 10 GB.
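As a sketch of that layer reuse, a hypothetical child Dockerfile (the image and path names here are illustrative, not from the question) might look like:

```dockerfile
# Hypothetical parent image that already contains the 10 GB data layer
FROM my-large-image:latest

# Only these new layers add to the stored size; the parent's 10 GB
# layers are referenced by the layered filesystem, not copied.
COPY app/ /app/
CMD ["/app/run.sh"]
```

You can confirm the layering with `docker history <image>`, which lists each layer and its size.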

Is there a better way of referencing such files?

If you already have some way to distribute the data, I would use a "bind mount" to attach the host directory to the containers:

docker run -v /path/to/data/on/host:/path/to/data/in/container <image> ...

That way you can change the image and you won't have to re-download the large data set each time.
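As a concrete sketch of the command above (the host path `/srv/dataset` and image name `my-app` are placeholders, and this assumes the data set already exists on the host):

```shell
# Bind-mount the host's data set into the container, read-only.
# /srv/dataset and my-app:latest are hypothetical names.
docker run --rm \
  -v /srv/dataset:/data:ro \
  my-app:latest \
  ls /data
```

The `:ro` suffix mounts the directory read-only, so no container can accidentally modify the shared data set.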

If you wanted to use the registry to distribute the large data set, but want to manage changes to the data set separately, you could use a data volume container with a Dockerfile like this:

FROM tianon/true
COPY dataset /dataset
VOLUME /dataset

From your application container you can attach that volume using:

docker run -d --name dataset <data volume image name>
docker run --volumes-from dataset <image> ...
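Putting those two commands together, an end-to-end sketch might look like this (`dataset-image` and `my-app` are placeholder names):

```shell
# Build the data volume image from the Dockerfile above.
docker build -t dataset-image .

# Create the data container once. Its process exits immediately
# (tianon/true does nothing), but the container only needs to exist
# to anchor the volume -- it does not need to stay running.
docker run -d --name dataset dataset-image

# Any number of application containers can then share the volume.
docker run --rm --volumes-from dataset my-app ls /dataset
```

Because `--volumes-from` works with stopped containers, the exited `dataset` container is enough to keep the volume attached.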

Either way, I think Docker volumes (https://docs.docker.com/engine/tutorials/dockervolumes/) are what you want.
