I have what I assume (and have confirmed with a few of you wonderful folks) is a somewhat “common” setup for distributing $WORK packages.
We have a private index that we upload closed-source packages to for distribution to our own servers (with other dependencies coming from upstream PyPI). We’ve used various implementations of that over the years – these days, DevPI.
With such a setup you run into an issue: all your dependencies are now available to your servers, but what if you want a requirements.txt (or whatever kind of lockfile your package manager cares about)? Your combined private+PyPI index has many versions of things, and you want your servers to get specific ones. You’re likely tracking those specific versions in requirements.txt files, but those live in VCS and don’t make it onto servers in the first place. So how do you end up installing specific versions on your servers?
Over the years, we’ve ended up answering that question in various ways. For illustration, I’ll use “server” to refer to where the app needs to be deployed, “build machine” to refer to some other outside machine that has access to both the VCS repository and the indices, “app” to refer to the thing you really want to deploy, and “dependencies” to refer to everything else, regardless of whether it comes from private or public indices. I’ll also assume you’re using pip, though perhaps that part doesn’t matter.
- Build a “fat” directory away from servers, on a build machine that grabs all dependencies, then ship the combined thing to servers, which use pip install --no-index --find-links to pull from that fat directory and therefore find exactly the versions you’re looking for (see the first sketch after this list)
- Give up on requirements.txt, and pin exact versions in setup.py
- Use a single binary style mechanism (zipapp, pex, shiv, etc.)
- Give up on pure-index based deployments, and clone your VCS on servers
- Manage copying requirements.txt files out of VCS and into some intranet hosted (or non-intranet hosted) server-accessible side channel
- Grant your servers access and a mechanism to pull one file, the requirements.txt (e.g. via the GitHub API), from VCS, and then have it present when running pip install -r pulled-requirements.txt app (see the second sketch after this list)
- Kludge your private index to always contain exactly the versions of each dependency you want, and no others (assuming you’ve got just a few apps, and use the same versions of their dependencies across all apps)
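For the first option, here’s a rough sketch of the fat-directory flow; the index URL and the app name are placeholders for illustration:

```
# On the build machine (which can reach VCS and both indices),
# download the app and every pinned dependency into a local directory:
pip download app -r requirements.txt -d ./wheelhouse \
    --index-url https://devpi.example.internal/root/prod/+simple/

# Ship ./wheelhouse to the server (rsync, baked into an image, etc.),
# then install there without touching any index at all:
pip install --no-index --find-links ./wheelhouse app
```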
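And for the pull-one-file option, a sketch of fetching just requirements.txt via the GitHub API, assuming a placeholder repo path and a token with read access:

```
# Fetch requirements.txt from the (private) repo, using the contents
# endpoint with the "raw" media type so we get the file body directly:
curl -sSf \
  -H "Authorization: Bearer $GITHUB_TOKEN" \
  -H "Accept: application/vnd.github.raw+json" \
  -o pulled-requirements.txt \
  https://api.github.com/repos/example-org/app/contents/requirements.txt

# Then install the pinned dependencies plus the app itself:
pip install -r pulled-requirements.txt app
```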
I’m sure there are a few options there I’ve either considered and forgotten, or haven’t considered at all. Definitely curious to hear from others on how they’ve solved the above.
Each of the above makes me feel… icky in one way or another, some more than others. Really what I’d like (barring some even better idea) is some index-integrated way of flagging packages as apps, and having the index track requirements.txts for them, such that pip install app makes use of a server-side requirements.txt. Presumably such an extension / protocol would be used exclusively by private indices, since multiple folks might deploy the same app with different concrete dependencies, though perhaps that’s solvable regardless. And there’s some trickiness to deal with here anyway, since you need some buy-in from the package manager handling the install: all the deps will be fetched in successive interactions with the index, not at the same time as the app itself.
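To make that slightly more concrete, here’s a purely hypothetical sketch of the flow I have in mind; none of these commands, flags, or index features exist today:

```
# Hypothetical: publish the app and, as a companion artifact, the exact
# requirements.txt it should be deployed with.
twine upload dist/app-1.2.3-py3-none-any.whl
index-client attach-requirements app==1.2.3 requirements.txt  # made-up tool

# Hypothetical: the installer sees that app is flagged as an "app",
# fetches the index-hosted requirements.txt first, and pins everything
# accordingly -- as if you had run:
pip install -r requirements.txt app==1.2.3
```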
Not trivial, I’m sure. But it occurs to me at least. Happy to elaborate on this idea if need be. But definitely curious to hear thoughts.
-J