How to ensure stable dependencies for a CLI application

My actual problem is that I am collaborating on a Python CLI application with people who are not used to Python or its ecosystem. At some point we discussed the distribution of the project, which includes users installing it on their workstations and CI workers using it for some automated tasks. I just wanted to publish it to our internal PyPI, since that would make the installation experience the best for the users. But they didn’t like the thought that this would mean pulling dependencies from the web, which might differ from the ones we pulled when we ran our tests, in particular for the CI workers, where a break might block our workflow.

Now, here are some things I’ve considered, none of which seem like a clean solution:

  • pin all dependencies to their patch level (can easily lead to unnecessary incompatibilities for users who don’t isolate the app; also, post-releases can still slip through; see the pinning sketch after this list)
  • distribute the app as a docker image instead (the CLI includes some utilities that change the local file system, which is a pain to enable if it’s dockerized)
  • instead of publishing to a package index, zip up a wheelhouse and have users curl → unzip → pip install its content (I wasn’t even serious when I proposed it, but my colleagues liked the idea)
  • tell my colleagues that the python ecosystem is quite stable, and if it works for awscli to pull fresh dependencies, then it should be good enough for us (feels rude to just brush their worries aside, but it certainly is an option)
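For what it’s worth, the first and third options combine naturally: pin everything with hashes, then build a wheelhouse from the exact pinned set and ship that. A rough sketch, assuming pip-tools is available and treating mycli and the paths as placeholders:

    # pin everything, hashes included, from the top-level requirements
    # (pip-compile ships with the pip-tools package)
    pip-compile --generate-hashes requirements.in -o requirements.txt

    # build a wheel of the app itself, then add its pinned dependencies
    pip wheel --wheel-dir ./wheelhouse --no-deps .
    pip wheel --wheel-dir ./wheelhouse -r requirements.txt

    # on the user/CI side: install strictly from the wheelhouse, never the index
    pip install --no-index --find-links ./wheelhouse mycli

Everyone then installs exactly the artifacts the tests ran against, and nothing is pulled from the web at install time.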

I’m probably not the first person to run into this issue; is there a solution I’m not aware of?


Depends on how sensitive/pedantic you are about upstream projects breaking your CLI tool.

If you want something end-user facing and favor stability over the ability to automatically pull in bug fixes from your dependencies, you’re best off pinning everything and creating a single runnable distribution. In this case you can use shiv, pex or pyinstaller. The first two create zipapps that package your project and all your dependencies, while the last also packages the interpreter into the bundle. You can even create a docker image that does something similar: package the interpreter and your code into a single immutable container. However, your users would then need to install and understand docker to use it.
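For illustration, assuming the project declares a console-script entry point named mycli (a placeholder), building such a single-file artifact looks roughly like this:

    # shiv: bundle the current project and its dependencies into one zipapp
    shiv -c mycli -o mycli.pyz .

    # pex: same idea, different tool
    pex . -c mycli -o mycli.pex

Either file ships as a single artifact and runs on any machine with a compatible interpreter, e.g. ./mycli.pyz.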

If instead you’d like your users to automatically pull in bug fixes (and bug additions) from your dependencies, you can go down the path of shipping the tool as a CLI/library and instructing them to install it with pipx (or, in the worst case, pip).
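Concretely, once the tool is published to your internal index, that route looks like this (package name and index URL are placeholders):

    # install into an isolated venv, with the CLI exposed on PATH
    pipx install mycli --index-url https://pypi.internal.example/simple

    # later, pull in whatever fixes upstream has released
    pipx upgrade mycli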

So you’d have to pick your poison, meaning which one you value more :slightly_smiling_face: stability, or the ability to pull in fixes automatically with a reinstall. In my experience developer tools generally follow the latter, while end-user facing/web apps do the former.


+1, great answer.

On top of this, Docker is often overkill for a plain-old Python app. If you use a statically linked CPython build and provide launch scripts for your users (that pass -I to your Python runtime) then you should have more than enough isolation. (On Windows, the ._pth file is designed to make this isolation even more robust for your kind of application, but we haven’t ported its support to other platforms yet.)
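A minimal sketch of such a launch script on POSIX, assuming a hypothetical layout where the bundled interpreter sits next to the app and mycli is installed into its site-packages:

    #!/bin/sh
    # locate the directory this script lives in
    HERE="$(dirname "$0")"

    # -I runs Python in isolated mode: PYTHONPATH, user site-packages
    # and the script directory are all ignored, so only the bundled
    # environment matters
    exec "$HERE/python/bin/python3" -I -m mycli "$@"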

Agree 100%, and I’d argue that a developer tool that isn’t tied to the project at hand is actually in the former category. (For example, your linting tool probably varies by project, so let it be entangled with the project settings, but a compiler is more closely tied to the target than the project, so keep it stable and (as the distributor) take responsibility for updating it and its dependencies.)


@bernatgabor fyi, I decided to go with pex because it looked like the most established solution to our kind of problem, and I got everything to work nicely (after a bug was fixed - lovely community, they helped me a lot with all my setup troubles). Thanks for the tip!
