Splitting `packaging` up into separately installable parts and limiting its dependencies seems like a reasonable option to me. In essence, we’re saying that `packaging` is “packaging infrastructure” code, and to make it usable in tools that bootstrap the packaging experience, it can’t use non-stdlib libraries as freely as “normal” code can.
But I will say that the more we do this, the more we’re acknowledging a fundamental limitation of Python packaging: we don’t (won’t, can’t) “eat our own dogfood”, in the sense that we want to make it possible for people to use packages off PyPI, but we claim that our own code is special and can’t do that. I was mildly uncomfortable with pip having to do that (pip is huge because of everything we vendor), but the case for pip not being able to depend on non-stdlib packages is much stronger than for other tools (the chicken and egg issue)¹. Surely any “problem for tools that `pip install wheel`” is also a problem for tools that `pip install requests`, or indeed any other large package? That’s a strawman, but I genuinely would like to know what the real issue is here. Is it that virtualenv pre-installs a load of stuff, and that allows people to use (say) pyparsing without remembering to explicitly install it? Or is it the download times (which pip’s cache should mitigate), or something else?
¹ Stronger, but not absolute - I still think we’d be better with a world where a very basic wheel-only installer came with Python and only used the stdlib, and we used that to bootstrap more capable frontends like pip, opening up the world for more competition in frontends, the way PEP 517 broke setuptools’ monopoly on backends.
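To make the footnote concrete: since a wheel is just a zip archive laid out for extraction, the “very basic wheel-only installer” really could be tiny. Here’s a rough stdlib-only sketch (the function name `install_wheel` is mine, and it deliberately ignores dependency resolution, platform/tag checking, RECORD updating, script generation, and everything else a real installer must handle):

```python
# Minimal sketch of a stdlib-only wheel installer, assuming you already
# have a .whl file on disk. NOT a real installer: no dependency
# resolution, no tag compatibility checks, no RECORD rewriting, no
# entry-point script generation.
import sysconfig
import zipfile


def install_wheel(wheel_path, target=None):
    """Unpack a wheel (a zip archive) into a site-packages directory.

    Wheels store their contents in the layout they should have after
    installation, so for pure-Python packages a plain extraction is
    already most of the job.
    """
    if target is None:
        # Default to this interpreter's pure-Python site-packages dir.
        target = sysconfig.get_paths()["purelib"]
    with zipfile.ZipFile(wheel_path) as whl:
        whl.extractall(target)
    return target
```

Something of roughly this shape, shipped with Python itself, would be enough to bootstrap a fully-featured frontend like pip from wheels on PyPI, without the frontend needing to vendor its whole dependency tree.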