If Python started moving more code out of the stdlib and into PyPI packages, what technical mechanisms could packaging use to ease that transition?

Currently, the stdlib directory has higher priority than site-packages on sys.path, and none of our tools are prepared to mutate the stdlib directory, only site-packages.
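
To make that concrete, here's a quick way to see the ordering on a stock CPython (exact paths vary by platform and install):

    import sys
    import sysconfig

    # The stdlib directory sorts ahead of site-packages on sys.path,
    # so a stdlib module always shadows a same-named distribution
    # that pip installs into site-packages.
    for entry in sys.path:
        print(entry)

    print("stdlib:       ", sysconfig.get_path("stdlib"))
    print("site-packages:", sysconfig.get_path("purelib"))

Of course we could change things if we wanted to. But given how they work now, I guess the simplest approach would be: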

  • Packages in state (2) or higher go in site-packages, since those are the packages that need to be upgradeable, and only stuff in site-packages can be upgraded.
  • If we copy Ruby’s distinction between (2) and (3), then we’ll need some way to track which packages are in state (2), and to update pip uninstall so that it checks that list and errors out if someone tries to remove one of them (a sketch of what that check might look like follows this list).
  • And virtualenv and venv will need changes so that when you use --no-site-packages (isolation is the default in venv anyway), they’ll prepopulate the new environment with all the packages in (2) and (3); see the second sketch below.
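
For the pip uninstall side, the check might look something like this. Everything here, the protected-packages.json file included, is made up for illustration; nothing like it exists in pip today:

    import json
    import sysconfig
    from pathlib import Path

    # Hypothetical: a list of state-(2) package names that the
    # interpreter ships next to its stdlib.
    PROTECTED_FILE = Path(sysconfig.get_path("stdlib")) / "protected-packages.json"

    def check_not_protected(names):
        """Refuse to proceed if any requested package is in state (2):
        upgradeable, but not removable."""
        try:
            protected = set(json.loads(PROTECTED_FILE.read_text()))
        except FileNotFoundError:
            return  # this interpreter ships no protected list
        blocked = sorted({n.lower() for n in names} & protected)
        if blocked:
            raise SystemExit(
                "Cannot uninstall {}: required by this Python distribution "
                "(upgrading is allowed, removing is not)".format(", ".join(blocked))
            )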
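
And on the venv side, a sketch of the prepopulation step, assuming CPython shipped wheels for the (2)/(3) packages the way ensurepip ships a bundled pip wheel today (the bundled-wheels directory is invented here):

    import subprocess
    import sysconfig
    import venv
    from pathlib import Path

    # Hypothetical location for the wheels of the (2)/(3) packages.
    BUNDLED_WHEELS = Path(sysconfig.get_path("stdlib")) / "bundled-wheels"

    class PrepopulatingEnvBuilder(venv.EnvBuilder):
        def post_setup(self, context):
            # post_setup() runs after ensurepip has installed pip into
            # the new environment (because of with_pip=True below).
            if not BUNDLED_WHEELS.is_dir():
                return  # nothing bundled on this interpreter
            wheels = sorted(str(p) for p in BUNDLED_WHEELS.glob("*.whl"))
            if wheels:
                subprocess.run(
                    [context.env_exe, "-m", "pip", "install",
                     "--no-index", *wheels],
                    check=True,
                )

    PrepopulatingEnvBuilder(with_pip=True).create("demo-env")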

I guess it might be possible to allow the stdlib to depend on packages in state (2), since those are defined to always be available. But yeah, life would probably be simpler all around if we followed a rule of always splitting out leaf packages first.

It might be OK if pip simply depends on those packages? Each new environment would get bootstrapped with copies of pip and all of its dependencies. From that point on you have a working package manager, so assuming everything works correctly, its regular dependency handling should prevent you from removing those packages (unless you also remove the package manager). The reverse-dependency check this implies is sketched below.
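
Roughly, that check means walking the metadata of every installed distribution to see who requires a package before letting it go. A sketch using importlib.metadata plus the PyPA packaging library (pip doesn't actually do this on uninstall today):

    from importlib import metadata
    from packaging.requirements import Requirement

    def dependents(name):
        """List installed distributions that declare a hard dependency on *name*."""
        name = name.lower()
        rdeps = set()
        for dist in metadata.distributions():
            for req_str in dist.requires or ():
                req = Requirement(req_str)
                # Ignore extras/conditional requirements for this rough check.
                if req.name.lower() == name and req.marker is None:
                    rdeps.add(dist.metadata["Name"])
        return sorted(rdeps)

    # A dependency-aware uninstall would refuse to remove a package
    # while its dependents() list is non-empty.
    print(dependents("packaging"))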

Of course, “assuming everything works correctly” might be too strong an assumption :-). We might prefer to vendor pip’s dependencies instead, to make it harder for folks to shoot themselves in the foot with things like a forced pip uninstall, or because we don’t trust pip’s resolver to actually work correctly yet.
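
Vendoring is in fact what pip already does for itself: private copies of its dependencies live under the pip._vendor package, so deleting, say, the packaging distribution from site-packages can’t break pip:

    # Importing pip's internals is unsupported; this is only to show
    # where the vendored copies live.
    from pip._vendor import packaging
    print(packaging.__name__)   # -> 'pip._vendor.packaging'
    print(packaging.__file__)   # somewhere inside pip/, not a top-level install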