I recently stumbled over packaging issues again since I wanted to migrate my command-line applications from `setup.py` to the new `setup.cfg`.
I had a small discussion with @sinoroc because `setup.cfg` does not allow referencing an external file for `install_requires`. While this makes sense for a library which is used by other applications, it doesn't make sense for a complete application to ship without pinned dependencies.
So imho there are two fundamentally different use cases:
- distributing a library
- distributing an application
and both have contradictory requirements.
For an application I go to great lengths (CI, local tests, etc.) to ensure that everything works as expected, and as an application user I expect the application to install with the correct, tested dependencies. Of course, as a user I should be able to update the dependencies manually, but the first installation should always be a working (and tested) one.
Packaging an application
As an application developer I'm interested in defining my dependencies only once, and currently I use a `requirements_setup.txt`. Its content is used for CI, for local testing and of course, currently, in `setup.py` via `install_requires`.
I have an additional file `requirements.txt` for people who want to contribute, which adds some testing dependencies and references the first file via `-r requirements_setup.txt`.
That way I have a single source of truth for both the dependencies needed for a successful installation and those needed for developing and running the tests.
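A minimal sketch of how such a file can feed `install_requires` from `setup.py` (the file is created on the fly here, and the pinned packages are only illustrative):

```python
# Sketch: read the pinned runtime dependencies from requirements_setup.txt
# so setup.py, CI and local testing share one source of truth.
import tempfile
from pathlib import Path

def read_pinned_requirements(path):
    """Return the non-empty, non-comment, non-'-r' lines of a requirements file."""
    return [
        line.strip()
        for line in Path(path).read_text().splitlines()
        if line.strip() and not line.strip().startswith(("#", "-r"))
    ]

# Demonstrate with a throwaway file standing in for requirements_setup.txt.
with tempfile.TemporaryDirectory() as tmp:
    req = Path(tmp) / "requirements_setup.txt"
    req.write_text("# pinned runtime deps\nrequests==2.31.0\nclick==8.1.7\n")
    print(read_pinned_requirements(req))  # ['requests==2.31.0', 'click==8.1.7']
```

In `setup.py` the result is then passed straight to `setup(install_requires=...)` — exactly what `setup.cfg` has no equivalent for.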
Since I can't reference that file, my current setup won't work with `setup.cfg`.
Not pinning the dependency versions is also not an option, because dependency changes have broken the application countless times.
So with the current state of tooling the only option would be to manually copy the dependencies back and forth between `requirements_setup.txt` and `setup.cfg`, which is impractical and error-prone.
Installation of the app
Since my targets are small computers or Linux machines like Raspberry Pis, it's okay to assume that a more or less recent version of Python is already installed.
Still, I have to document the installation:
- create a folder for the application
- create a venv in that folder (in the subfolder `venv`)
- activate the venv
- `pip install my_application`
- run it with `my_application` (via the entry point)
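Scripted with the standard library, these steps look roughly like this (folder and package names are made up; the install/run steps are only comments because they need the package to actually exist on PyPI):

```python
# The documented install flow, scripted with the stdlib.
# Folder and package names are illustrative.
import venv
from pathlib import Path

app_dir = Path("my_application_dir")          # 1. create a folder for the app
app_dir.mkdir(exist_ok=True)
venv.create(app_dir / "venv", with_pip=True)  # 2. create the venv in ./venv
vbin = app_dir / "venv" / "bin"               # 3. calling the venv's own tools
                                              #    replaces "activate" in a script
# subprocess.run([vbin / "pip", "install", "my_application"], check=True)  # 4. install
# subprocess.run([vbin / "my_application"], check=True)                    # 5. run entry point
```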
In this thread I've discovered pipx, which seems nice, but then the installation flow would be:
- create a folder for the application
- `pip install pipx` (because it's not available anywhere by default)
- `pipx install my_application`
- run `my_application`
Which is not much better.
Also, it's unclear to me how I can install the same application twice in different folders (e.g. a dev and a stable version). Currently with pip I can provide users with git links for tests/hotfixes, and they can just install into another folder and test, which is very nice and convenient.
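What I mean by side-by-side installs is roughly the following; the folder names are hypothetical, and the pip commands are shown as comments because they need network access and a real package/repository:

```python
# Two independent copies of the same application, each in its own venv.
import venv
from pathlib import Path

for copy in ("stable", "dev"):
    venv.create(Path(copy) / "venv", with_pip=True)

# stable/venv/bin/pip install my_application             <- released version
# dev/venv/bin/pip install git+<repo-url>@<hotfix-ref>   <- test build from a git link
```

The two installs share nothing, so a broken hotfix in `dev` can never affect the `stable` copy.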
Improvement
I'd really love to have standard tooling for installing/publishing libraries and for installing/publishing applications.
There are wonderful tools already available, but everything is still hard because:
- I don’t know all the good tools
- the tools are still missing on the user's computer and have to be installed beforehand
Python claims to have "batteries included", so why not include an easy and convenient way to install applications? Including something like pipx, which adds a way to install an application locally in a folder (with a venv), would be a good start and would already simplify things.
For developers, differentiate between libraries and applications and provide appropriate tooling for each.
Also, patronizing developers by not allowing application authors to specify a file as a dependency source will just lead to frustration: people will hate the manual effort and do it anyway.