I recently wrote a small tool, named pip-deepfreeze, that manages the dependencies of a Python application in a virtual environment.
I thought it might be interesting for the packaging community, hence this post.
It lives in the same feature space as other environment management tools although, by design, it has a narrower scope than most. The implementation approach is different, as it relies on:
the existing PEP 517 build backend configuration to declare top-level dependencies,
the pip CLI for dependency resolution and install/uninstall (plus some pkg_resources/importlib.metadata/packaging for dependency tree construction),
pip’s ubiquitous requirements.txt format to pin dependencies.
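To make that concrete, here is a minimal sketch of the general approach (not the actual pip-deepfreeze code: the helper, the file handling and the exact pip invocations are simplified assumptions). It drives the documented pip CLI of the target environment and re-pins the result with pip freeze:

```python
import os
import subprocess

def pip(python, *args):
    # Run the documented pip CLI of the target environment; `python` is the venv's interpreter.
    return subprocess.run(
        [python, "-m", "pip", *args], check=True, capture_output=True, text=True
    ).stdout

def sync(python):
    # Install the project (editable) with its dependencies, constrained by existing pins if any.
    constraints = ["-c", "requirements.txt"] if os.path.exists("requirements.txt") else []
    pip(python, "install", "-e", ".", *constraints)
    # Re-pin whatever is now installed, in pip's ubiquitous requirements.txt format.
    frozen = pip(python, "freeze", "--exclude-editable")
    with open("requirements.txt", "w") as f:
        f.write(frozen)
```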
While useful in itself and hopefully filling a niche, it is also an experiment I made as a pip committer and PyPA contributor. Indeed, I believe pip’s CLI, together with the PyPA libraries that implement standards, should make it easy(er) to write such tools. As time permits, I plan to use the experience gained from writing this to inform the evolution of the pip CLI (e.g. pip reporting what it did or would do in a structured format, or upgrade strategies).
Here is an excerpt of the README, summarizing its main features:
installing the project and its dependencies,
updating the environment with new dependencies as the project evolves,
uninstalling unused dependencies,
refreshing dependencies,
maintaining pinned versions in requirements.txt,
pinning versions for extras in requirements-{extra}.txt,
displaying installed dependencies as a tree.
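To illustrate that last point, a dependency tree can be derived from installed metadata alone. Below is a rough sketch using importlib.metadata and packaging, not the actual implementation (which also has to deal with extras and with its notion of top-level requirements):

```python
import importlib.metadata as im
from packaging.requirements import Requirement
from packaging.utils import canonicalize_name

def print_tree(dist_name, indent=0, seen=None):
    # Recursively print a distribution and its Requires-Dist dependencies.
    seen = set() if seen is None else seen
    name = canonicalize_name(dist_name)
    try:
        dist = im.distribution(name)
    except im.PackageNotFoundError:
        print("  " * indent + f"{dist_name} (not installed)")
        return
    print("  " * indent + f"{dist.metadata['Name']}=={dist.version}")
    if name in seen:
        return  # guard against dependency cycles
    seen.add(name)
    for req_str in dist.requires or []:
        req = Requirement(req_str)
        # Skip requirements whose environment markers do not apply (extras evaluated as empty).
        if req.marker and not req.marker.evaluate({"extra": ""}):
            continue
        print_tree(req.name, indent + 1, seen)

print_tree("pip")  # e.g. show pip's own dependency tree in the current environment
```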
A few characteristics of this project:
It is easy to use.
It is fast, with very little overhead on top of a regular pip install + pip freeze.
It relies on the documented pip command line interface and its ubiquitous requirements file format.
It assumes your project is configured using a PEP 517 compliant build backend but otherwise makes no assumption on the specific backend used.
It has first class support for dependencies specified as VCS references.
It is written in Python 3.6+, yet works in any virtual environment that has pip installed, including Python 2.
It is reasonably small and simple, with good test coverage, and is hopefully easy to maintain.
Nice! One thing I particularly like is the choice to make the tool run outside of the target environment. More user-facing packaging tools should work like this.
pipdeptree (v2.0.0 on PyPI) can also do this; I wonder how much the two differ?
pip is sadly quite slow, but hopefully the new dedicated installers can be faster. Can you quantify what slow means in your definition? 100 ms, sub-second, sub 5 seconds, etc.?
When I wrote this feature, pipdeptree needed to be installed inside the target environment. That was the main reason to write that subfeature at the time. It may also differ in details regarding the handling of extras, the display of repeated identical dependencies, and its notion of top-level requirements.
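For reference, the metadata of another environment can indeed be inspected without installing anything into it. A minimal sketch of that idea (assuming the target interpreter can report its site-packages via site.getsitepackages(); this is not the actual pip-deepfreeze code):

```python
import json
import subprocess
import importlib.metadata as im

def target_site_packages(python):
    # Ask the target environment's interpreter where its site-packages directories are.
    out = subprocess.run(
        [python, "-c", "import json, site; print(json.dumps(site.getsitepackages()))"],
        check=True, capture_output=True, text=True,
    ).stdout
    return json.loads(out)

def installed_in(python):
    # List distributions installed in the target environment, from outside of it.
    return {
        dist.metadata["Name"]: dist.version
        for dist in im.distributions(path=target_site_packages(python))
    }
```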
I’m afraid I can’t quantify that. The main usage scenario is running pip-df sync interactively each time a team member does a git pull that possibly affects locally installed dependencies. Let’s say it is faster than some other tools I tried in that space, for the specific use cases of my group. And since it has little overhead on top of pip, and is built on top of the pip CLI, any performance improvements made in pip will immediately benefit pip-deepfreeze.
BTW, about performance, the most painful perceived impact is finding the project name, which in the worst case requires a PEP 517 metadata preparation… Yay PEP 621!
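To illustrate why PEP 621 helps: when pyproject.toml declares the project name statically, no build backend has to be invoked at all; otherwise a PEP 517 metadata preparation is the fallback. A rough sketch of that logic, using tomllib and pyproject-hooks (not necessarily what pip-deepfreeze does internally):

```python
import pathlib
import tempfile
import tomllib  # Python 3.11+; tomli provides the same API for older versions

from pyproject_hooks import BuildBackendHookCaller, default_subprocess_runner

def project_name(source_dir="."):
    pyproject = tomllib.loads((pathlib.Path(source_dir) / "pyproject.toml").read_text())
    # Fast path: PEP 621 static metadata, no build backend invocation needed.
    name = pyproject.get("project", {}).get("name")
    if name:
        return name
    # Slow path: ask the PEP 517 build backend to prepare the metadata.
    backend = pyproject.get("build-system", {}).get(
        "build-backend", "setuptools.build_meta:__legacy__"  # PEP 517 legacy fallback
    )
    hooks = BuildBackendHookCaller(source_dir, backend, runner=default_subprocess_runner)
    with tempfile.TemporaryDirectory() as tmp:
        dist_info = hooks.prepare_metadata_for_build_wheel(tmp)
        metadata = (pathlib.Path(tmp) / dist_info / "METADATA").read_text()
    for line in metadata.splitlines():
        if line.startswith("Name:"):
            return line.split(":", 1)[1].strip()
```

In practice the build backend and its requirements must be importable when the slow path runs, which is part of why that path is comparatively expensive.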
(Can we just copy requirements.txt to constraints.txt and strip the constraints from the original requirements.txt, with e.g. sed on non-Windows platforms?)
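For what it's worth, such a split can also be done portably in Python rather than with sed. A quick sketch, assuming plain `name==version` lines (editable and VCS entries would need special handling):

```python
import shutil
from packaging.requirements import Requirement

# Keep the pins as constraints...
shutil.copy("requirements.txt", "constraints.txt")

# ...and rewrite requirements.txt with the version specifiers stripped (works on Windows too).
with open("requirements.txt") as f:
    lines = [line.strip() for line in f if line.strip() and not line.startswith(("#", "-"))]
with open("requirements.txt", "w") as f:
    f.writelines(Requirement(line).name + "\n" for line in lines)
```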