Consider this a topic for the same. virtualenv bundles wheels for pip, setuptools, and wheel, which makes it tricky to add dependencies to any of these packages.
Long term, I think making pip default to performing isolated builds and having setuptools deprecate direct calls to setup.py might be sufficient here – it would allow virtualenv to stop bundling the other two packages and provide only pip in the generated environments.
Are there any changes we can make to our toolchain, in the meantime, to make it less tricky to add dependencies to wheel and setuptools?
This is a valiant thought, but not happening anytime soon (at least 3+ years, considering virtualenv’s deprecation policy). So for the foreseeable future we need to add setuptools and wheel.
Why is it hard/undesirable for bootstrap packages to take on dependencies?
- All dependencies are always present in virtual environments, so the number of packages users can depend on without specifying them in their install_requires will keep growing over time.
- All dependencies will be present in every created virtual environment, so there’s potentially a lot of wasted disk space, plus some slowdown in environment creation due to the extra time needed to perform these installations.
- virtualenv would need to implement support for bootstrap packages that have dependencies - this requires a rewrite of how we perform the installation and a rewrite of the auto-update strategy (because we would no longer need to update only our bootstrap packages, but also the dependencies of those bootstrap packages) - this is a major effort.
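The first point above can be demonstrated with a short stdlib-only sketch (the helper name `seeded_distributions` is made up): any environment created by the stdlib `venv` module with `with_pip=True` already contains pip without any project declaring it, and virtualenv additionally seeds setuptools and wheel.

```python
import json
import subprocess
import sys
import venv
from pathlib import Path

def seeded_distributions(env_dir: Path) -> set[str]:
    """Create a fresh environment and return the distribution names seeded into it."""
    # The stdlib venv seeds pip; virtualenv would additionally seed
    # setuptools and wheel (its other bootstrap packages).
    venv.create(env_dir, with_pip=True)
    python = env_dir / ("Scripts" if sys.platform == "win32" else "bin") / "python"
    out = subprocess.check_output(
        [str(python), "-m", "pip", "list", "--format=json", "--disable-pip-version-check"]
    )
    return {dist["name"] for dist in json.loads(out)}
```

Everything this returns is importable in the new environment even though nothing declared it as a dependency - which is exactly why growing this set is a hazard.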
None of these issues is per se toolchain-specific as far as I can see, and I’m not sure we can address any of them.
That’s what long-term means.
I don’t think those packages should be stretching too much to accommodate/be blocked by virtualenv’s (IMO) extremely… generous… deprecation policy.
I think the onus is on virtualenv to mitigate issues that stem from its support/deprecation policy if the assumptions baked into its implementation change (either by sticking with older versions, or adding the additional dependencies for a slowdown, or by adding more knobs to the CLI, or something else).
Of course, we don’t want to make willy-nilly changes that break things for each other, but the situation that wheel / setuptools can’t have dependencies because of how virtualenv+pip work isn’t great.
You missed half my point: the disk space/network/installation overhead, and these packages being available in a virtual environment without needing to be specified in install_requires, exist independently of what pip+virtualenv do. Unless we can come up with a self-contained binary for pip/virtualenv/setuptools, these will not change.
Disk space issues are inherent in how virtualenv works. I don’t want to argue endlessly about the “disk space is cheap” claim, but every development environment I set up has a virtualenv containing black, flake8, tox (with its own copy of virtualenv), and pytest, plus all of their dependencies.
If we want to address disk space issues, we should be looking at ways to share common tools/dependencies, not vendor packages into tools or make tools reimplement features.
Similarly, if “dependencies of stuff are available without being explicitly specified” is an issue, we should be looking at ways to declare dependencies as “hidden”, not argue against dependencies.
Basically, we should treat packaging tools as use cases that motivate general features, not as special cases that get a free pass to not follow best practices because they are somehow “special”. (And yes, I do intend that same argument to apply to pip!)
If virtualenv wants to optimize these operations, that’s perfectly fine! OTOH, I don’t think it’s reasonable that other packages should be constrained by this design choice/prioritization in virtualenv.
And, Paul said everything else I wanted to say.
I appreciate the context that @bernatgabor has provided here on why we are where we are, and do know that it’ll take effort to make the “long term” changes. I think we all do agree on that (if you don’t, please go start a new topic/thread).
To keep this thread from diverging further, I’ll quote the main question I asked in the OP:
Getting that minimal wheel installer tool standardised and in the stdlib would help venv and ensurepip. Currently we can’t update 3.10 with the latest pip because of a change (in CPython and/or certifi) that breaks mounting the wheel as a zip file to use pip to install itself. More dependencies, and more wheels, won’t make that easier. Removing the need to use pip to install pip, would.
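To make “minimal wheel installer” concrete: a wheel is just a zip archive laid out roughly the way site-packages expects, so the core of such a tool is tiny. This is a sketch under that assumption (`install_wheel` is a hypothetical name); a real installer would also generate entry-point scripts, rewrite shebangs, and update RECORD, which is why standardising one is still real work.

```python
import zipfile
from pathlib import Path

def install_wheel(wheel_path: Path, target: Path) -> list[Path]:
    """Extract a wheel (a zip archive) into `target`, which plays the role of site-packages."""
    installed = []
    with zipfile.ZipFile(wheel_path) as wheel:
        # A wheel's payload is already arranged for site-packages, so the
        # "install" core is simply extraction, member by member.
        for name in wheel.namelist():
            wheel.extract(name, target)
            installed.append(target / name)
    return installed
```

A tool like this needs no pip and nothing mounted as a zip file - which is the point of the quoted question.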
Fat wheels (with multiple top-level packages) might be interesting, though honestly the best outcome would probably be for pip to learn how to run from a single place and install into whichever environment is active. More generally, pipx-style handling of tools would be valuable, but even just not having to install pip into every new environment, without losing the ability to install new packages into it, would be great.
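For the “run pip from a single place” idea, newer pip already carries most of the mechanism: pip 22.3 added a global `--python` option that tells one pip installation to operate on a different environment. A hedged sketch of how a shared-tools setup could drive it (the paths and the helper name are made up):

```python
def cross_env_install_cmd(shared_python: str, target_python: str, *packages: str) -> list[str]:
    """Build the argv for one shared pip installing into another environment.

    Relies on pip's global `--python` option (pip 22.3+); run the result
    with subprocess.run(cmd, check=True) to actually perform the install.
    """
    # Only `shared_python` needs pip installed; `target_python`'s
    # environment receives the packages without ever containing pip.
    return [shared_python, "-m", "pip", "--python", target_python, "install", *packages]
```

With something like this, new environments could stay pip-free while remaining installable-into, which is the pipx-style handling described above.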