If we move packages out of the stdlib, who maintains them?

@njs’ idea of bundling uninstallable wheels for the stdlib packages is good, but I want to take a small step back and try to see things from the user’s perspective. I want to talk about maintenance responsibility and distribution; distribution is probably the easier of the two, so let’s start there.

Distribution

Once a distribution of Python is created, users only see what is in the distribution, not how the distribution came to be. On Windows it is somewhat common to find downloadable distributions of Python that include tons of stuff, like Python(x,y), WinPython, Anaconda distribution, Enthought Python Distribution, ActivePython, and I’m sure there are others I’ve forgotten.

So, hypothetically, if upstream Python detached the stdlib, these distributors could just add them back in (along with all the other deps they bundle) and users of those distributions would be unaffected. WinPython, for example, even includes a GUI package manager that users could use for upgrading or uninstalling those same ex-stdlib modules.

I think that’s interesting: the act of putting together a distribution creates what users see. (To deploy Python apps to non-developers, developers already have to either make some kind of mini-distribution or require that downstream users have Python installed, and this whole area has been painful for everyone for years. There may be an opportunity here to build an excellent bundling story for the Python interpreter in general.)

In principle, Linux maintainers could also add the stdlib modules back into their platform’s distribution of Python, and likewise Homebrew. (There are already many third-party packages installable through Linux package managers, so those are effectively “included in that Python distribution”.)

Sometimes users choose certain distributions based on trust.

Some enterprises and educational institutions will only trust a distribution that comes from python.org. In this case, an “officially blessed” Python distribution could be created (by re-bundling the now-detached stdlib back in) and offered for download at python.org. It is even conceivable (though probably contentious) to bundle popular, highly-regarded community packages into the same distribution for download at python.org (and @njs’ suggestion of wheel bundling for all these packages would be how to construct the bundle).

I think that the “making a distribution bundle” of Python that includes what we now call the stdlib, and maybe other things, could be fairly simple at a technical level. As I said, there are a bunch of such projects already available.
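To make the “fairly simple at a technical level” claim concrete, here is a minimal sketch of what the re-bundling step might look like for a distributor: install pre-downloaded ex-stdlib wheels from a local directory into the distribution’s tree, offline. The module names and directory layout here are purely hypothetical, not any agreed design:

```python
import sys

# Illustrative list only; which modules would actually be unbundled is
# exactly what's under discussion.
EX_STDLIB_WHEELS = ["nntplib", "telnetlib", "cgi"]

def rebundle_command(wheel_dir: str, target: str) -> list[str]:
    """Build the pip invocation a distributor might run to re-bundle
    ex-stdlib wheels into a distribution, without touching the network."""
    return [
        sys.executable, "-m", "pip", "install",
        "--no-index",               # never hit PyPI: the bundle is self-contained
        "--find-links", wheel_dir,  # resolve from the shipped wheel directory
        "--target", target,         # install into the distribution's own tree
        *EX_STDLIB_WHEELS,
    ]
```

A distributor’s build script would run this command (e.g. via `subprocess.run`) as one late step in assembling the installer, which is roughly what projects like WinPython already do for their bundled third-party packages.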

The other issue is a much harder problem.

Responsibility for Maintenance

I’m thinking more about responsibility than actions. As we all know (and as PEP 594 details), maintenance of the stdlib is a lot of work for the core devs. It isn’t only the maintenance work itself; it’s also carrying the responsibility for that work, which weighs especially heavily on volunteers.

Responsibility matters because the concern from the community might be “if they (the core devs) don’t maintain these ex-stdlib modules, who will?”. And I think it is likely that, were the stdlib unbundled, some of the libs might well go unmaintained, especially the older stuff, and especially the stdlib modules superseded by PyPI packages.

A similar question might be “if the stdlib were detached from core Python, but then later re-bundled into a distribution for download, who will then be responsible for the libs included in the bundle?”.

I don’t know how to answer that question, but I do feel it is wrong for the community to demand that core devs spend their precious volunteer time to work on things they might not want to do, like maintaining older batteries that very few people use. I’d guess that any desire to maintain nntplib falls off a cliff shortly after detaching the stdlib, even if it is later re-bundled into a downloadable distribution :slight_smile:

In the simplest case, suppose the stdlib were unbundled from core Python, but then an “officially blessed” Python distribution (bundling those now-on-PyPI stdlib packages) was put up on python.org: would there be an expectation that the core devs are also responsible for everything in the bundle? I would hope not. That isn’t how Python(x,y) or WinPython works, or how it works when third-party packages get bundled into Linux distributions. But it would probably affect the trust relationship for a certain group of users.

So, if the stdlib were unbundled (put on GitHub/PyPI), how would expectations from those users need to change, if at all? Does the responsibility move to the distribution creators? Or do these now-on-PyPI stdlib packages simply become exactly like all other PyPI packages, with their own maintainers, their own GitHub repos, and so on? I feel that if detaching the stdlib happens, then this, i.e., PyPI, is the right way to go, but it definitely requires resetting users’ expectations somehow.

The idea of showing a deprecation warning when a user imports a stdlib module without declaring it in setup.py is fine (good, even!), but it does nothing to manage the trust relationship (“I use only stdlib libraries because I trust the core devs”) or to reset users’ expectations about how these libraries are maintained.
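For illustration, the warning half of that idea is mechanically straightforward: a transitional shim for an ex-stdlib module could emit something like the following on import. This is only a sketch of the mechanism; the helper name, the module named, and the wording are hypothetical, not any agreed design:

```python
import warnings

def warn_undeclared_stdlib_use(module_name: str) -> None:
    """Emit a DeprecationWarning for an ex-stdlib module that was imported
    without being declared as a dependency (illustrative sketch only)."""
    warnings.warn(
        f"{module_name} has moved out of the standard library; "
        f"declare it as a dependency (e.g. in setup.py) to keep using it",
        DeprecationWarning,
        stacklevel=2,  # point the warning at the importing code, not this shim
    )
```

A shim like this still says nothing about *who maintains* the module the user is being pointed at, which is the trust problem described above.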

1 Like

(Hi Caleb! I split your post out into a new thread, because Discourse threads work better when they stick to a pretty well-defined topic, and I don’t want your post to get lost in the technical nitty-gritty in the other thread.)

I can’t take credit for it :slight_smile: It’s come up lots of times in various forms.

If we did start moving stdlib modules into separate packages, I think there’s no question that we’d keep bundling formerly-stdlib modules into all standard python.org-blessed distributions, at least for some transition period. It’s unavoidable if we want to avoid breaking the world.

For any packages that we actually bundle with Python, we would sort of by definition be taking responsibility for their quality. That can mean a lot of things though – it could mean “we know that this is a dedicated team who understand their users’ needs better than we would, and we vouch for them”. (For example, the core devs agree that you should not use the standard library’s urllib; you should use the third-party requests or urllib3. If you trust the core devs, then you should trust us when we say that the library we maintain is not as good as the library those folks maintain.)

It could also mean “this was already completely unmaintained in the stdlib, and we’re working on making that more visible by unbundling it and just trying to give it a softer landing than if we deleted it outright”.

I expect there’d be a range of possible ways to work out the details, from keeping packages inside the main cpython repo but generating wheels from them, to keeping them in the python GitHub org, to handing them off to third-party maintainers with more or less oversight… ultimately, a lot of the point of doing this (if we do it) is to let us move away from the “one size fits all” model we currently have, and manage modules in ways that work better for each individual module.

2 Likes

The key problem here, which we see regularly in the packaging world, is that if someone makes a “distribution bundle”, their users expect to be able to get the latest versions of what’s in there. And if, for example, the bundle is slow to update, users start looking for ways to upgrade for themselves. At that point, managing a distribution becomes a very complex technical problem, as you have to handle the interactions between your package management and other tools (like pip, setuptools, etc.).

Curated distributions are great, but they do have their own issues.

Not just responsibility, but perception of responsibility. As things stand, the point of PEP 594 is that in actual fact these modules are already unmaintained, because the core devs don’t work on them. But there’s a perception (backed up by an implied commitment from the core devs) that they are, and the “maintenance burden” is largely one of managing that perception (working through issues that you don’t expect to fix, closing bug reports that have been left open with no action for years, fixing test failures in modules you don’t care about…)

Based on what I said above about distributions, the answer to that is simple - the providers of the distribution. This is why I don’t think having a “python.org blessed” extended distribution will help. Users will still expect support from the core devs, just “as the distro maintainers” rather than “as the core devs”.

The only real way that I see to achieve a change of (perceived and real) responsibility is to remove the modules from the stdlib and have someone else bundle the removed code into a PyPI package (which may in turn be included in an “extended distro”). That “someone else” is then clearly responsible for the package. But being a package maintainer is hard (a lot harder than just contributing to discussions on a mailing list!) and few people have the time or inclination to do it - even for a package they rely on. That’s not a criticism, so much as a reflection of human nature. But it is something we have to consider.

I agree 100% with everything you said.