Our Django project is getting huge. We have hundreds of apps and use a ton of third-party Python packages, many of which need C compilation. Our deployments are taking a long time.
Will it help if you have your build system (e.g. Jenkins) build and install everything into a build-specific virtual environment directory? When the build succeeds, you make the virtual environment relocatable, tar it up, and push the resulting tarball to your "released-tarballs" storage. At deploy time, you grab the latest tarball, unpack it on the destination host, and it is ready to execute. So if it takes 2 seconds to download the tarball and 0.5 seconds to unpack it on the destination host, your deployment will take 2.5 seconds.
The advantage of this approach is that all package installations happen at build time, not at deploy time.
Caveat: the build system worker that builds/compiles/installs things into a virtual env must use the same architecture as the target hardware. Also, your production box provisioning system will need to take care of the various C library dependencies that some Python packages have (e.g. PIL requires libjpeg before it can compile its JPEG-related code, and things will also break at runtime if libjpeg is not installed on the target box).
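On Debian-style boxes, provisioning the libjpeg dependency for PIL might look like this; the exact package names are an assumption and vary by distro.

```shell
# Debian/Ubuntu example: the -dev package provides the headers needed
# at build time; it pulls in the shared library needed at run time.
apt-get install -y libjpeg-dev
```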
It works well for us.
Making a virtual env relocatable:

virtualenv --relocatable /build/output/dir/build-1123423

In this example, build-1123423 is the build-specific virtual env directory.