My actual problem is that I am collaborating on a Python CLI application with people who are not used to Python or its ecosystem. At some point we discussed distribution of the project, which includes users installing it on their workstations and CI workers using it for some automated tasks. I just wanted to publish it to our internal PyPI, since that would give users the smoothest installation experience. But they didn't like that installs would then pull dependencies from the web, which might differ from the ones we pulled when we ran our tests — in particular for the CI workers, where a breakage might block our workflow.
Now, here are some things I’ve considered, none of which seem like a clean solution:
- pin all dependencies to their exact patch version (this can easily cause unnecessary conflicts for users who don't isolate the app in its own environment, and post-releases can still slip through)
- distribute the app as a Docker image instead (the CLI includes some utilities that modify the local file system, which is a pain to support from inside a container)
- instead of publishing to a package index, zip up a wheelhouse and have users curl → unzip → pip install its contents (I wasn't even serious when I proposed it, but my colleagues liked the idea)
- tell my colleagues that the Python ecosystem is quite stable, and that if pulling fresh dependencies works for awscli, it should be good enough for us (it feels rude to just brush their worries aside, but it certainly is an option)
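For concreteness, the pinning-plus-wheelhouse combination I'm describing would look roughly like this (just a sketch; `six` stands in for the app and its pinned dependencies, and the pinned version here is illustrative):

```shell
# Build machine: pin exact versions (normally via `pip freeze` of the
# environment the tests ran in), then download the matching wheels once,
# so the archive contains exactly the artifacts we tested with.
echo "six==1.16.0" > requirements.txt   # stand-in for the real app's pins
python -m pip download -r requirements.txt -d wheelhouse
tar czf wheelhouse.tgz wheelhouse       # zip works just as well

# User workstation / CI worker: install with no index access at all,
# so pip can only ever resolve against the artifacts we shipped.
tar xzf wheelhouse.tgz
python -m pip install --no-index --find-links=wheelhouse -r requirements.txt
```

This does make installs reproducible, but it also means every dependency bump requires rebuilding and redistributing the archive, which is part of why it feels unclean to me.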
I'm probably not the first person to run into this issue — is there a solution I'm not aware of?