PyPI is slow. How do I run my own server?

aychedee

Do you have a shared filesystem?

Because then I would use pip's download-cache setting. It's pretty simple. Make a folder called pip-cache, in /mnt for example:

mkdir /mnt/pip-cache

Then each developer would put the following lines into their pip config (Unix: $HOME/.pip/pip.conf, Windows: %HOME%\pip\pip.ini):

[global]
download-cache = /mnt/pip-cache

pip still checks PyPI for the latest version, then checks whether that version is already in the cache. If it is, pip installs it from there; if not, pip downloads the package, stores it in the cache, and installs it. So each package is only downloaded once per new version.
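A minimal sketch of the resulting workflow, assuming two developers both mount the shared cache and use the pip.conf above (requests is just an example package):

# first developer: pip checks PyPI, downloads requests, and stores the archive in /mnt/pip-cache
pip install requests

# any other developer with the same pip.conf: PyPI is still consulted for the
# latest version, but the archive itself is reused from /mnt/pip-cache
pip install requests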

While it doesn't solve your PyPI problem, handing built virtualenvs to developers (or deployments) can be done with Terrarium.

Use terrarium to package up, compress, and save virtualenvs. You can store them locally or even store them on S3. From the documentation on GitHub:

$ pip install terrarium
$ terrarium --target testenv --storage-dir /mnt/storage install requirements.txt

After building a fresh environment, terrarium will archive and compress the environment, and then copy it to the location specified by storage-dir.

On subsequent installs for the same requirement set that specify the same storage-dir, terrarium will copy and extract the compressed archive from /mnt/storage.
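In other words, you run the exact same command again; terrarium sees a matching archive under the storage-dir and extracts it rather than rebuilding the environment:

$ terrarium --target testenv --storage-dir /mnt/storage install requirements.txt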

To display exactly how terrarium will name the archive, you can run the following command:

$ terrarium key requirements.txt more_requirements.txt
x86_64-2.6-c33a239222ddb1f47fcff08f3ea1b5e1

I recently installed devpi into my development team's Vagrant configuration such that its package cache lives on the host's file system. This allows each VM to have its own devpi-server daemon that it uses as the index-url for virtualenv/pip. When the VMs are destroyed and reprovisioned, the packages don't have to be downloaded over and over; each developer downloads a package only once, and it stays in the cache on the host's file system.
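A rough sketch of such a setup (the command names match recent devpi releases, which split initialization from serving; older releases used devpi-server --start instead, and 3141 is devpi's default port):

pip install devpi-server

# one-time initialization of the server state, then start the caching server
devpi-init
devpi-server

Then each VM's pip config points at devpi's PyPI-mirroring root/pypi index:

[global]
index-url = http://localhost:3141/root/pypi/+simple/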

We also have an internal PyPI index for our private packages that's currently just a directory served by Apache. Ultimately, I'm going to convert that to a devpi proxy server as well, so our build server will also maintain a package cache for our Python dependencies in addition to hosting our private libraries. This will create an additional buffer between our development environment, our production deployments, and the public PyPI.

This seems to be the most robust solution I've found to these requirements to date.

Take a look at David Wolever's pip2pi. You can just set up a cron job to keep a company- or team-wide mirror of the packages you need, and then point your pips towards your internal mirror.
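A hedged sketch of that arrangement (the paths, host name, and schedule are placeholders):

# build or refresh the mirror into a directory your web server exposes
pip2pi /var/www/packages/ -r requirements.txt

# crontab entry: refresh the mirror nightly at 02:00
0 2 * * * pip2pi /var/www/packages/ -r /srv/app/requirements.txt

# point pip at the mirror's generated simple/ index instead of PyPI
pip install --index-url=http://pypi.internal.example.com/packages/simple/ requests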

Set up your local server, then modify the local computer's hosts file to override the real PyPI hostname so it points to the local server, skipping standard DNS resolution. Delete the line from the hosts file when you are done.

Or, I suppose, you could find the index URL in pip's configuration and modify that.
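For illustration, a hosts-file override might look like this (192.168.1.50 is a placeholder for your local server's address; note that pip reaches PyPI over HTTPS, so the local server would also need a certificate the clients trust for the redirect to work transparently):

# /etc/hosts on Unix, C:\Windows\System32\drivers\etc\hosts on Windows
192.168.1.50    pypi.org
192.168.1.50    files.pythonhosted.org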
