Non-standard standardized library

We all know the problem with adding new libraries to the standard library: it slows down their development, and new features only become available years later unless users install separate packages that backport the functionality.

A clear example is typing-extensions, whose docs state:

Enable use of new type system features on older Python versions
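
In practice that means the familiar version-gated import dance; for example, with override (added to typing in Python 3.12):

    import sys

    # Prefer the stdlib when it already has the feature; otherwise fall back
    # to the typing-extensions backport, which must be installed separately.
    if sys.version_info >= (3, 12):
        from typing import override
    else:
        from typing_extensions import override

Every project that wants the feature on older interpreters carries a shim like this, plus the extra dependency.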

Another consequence of being in the standard library is that modules never just die abruptly. Instead, you get a proper deprecation plan, recommended replacements, migration guides, and plenty of time before any removal (if removal ever happens).
On PyPI, by contrast, you can be happily running on Python 3.12 one day, and the next day you can’t upgrade because 8 of the 75 packages you depend on no longer support the new version and some will never be updated.

This forces major refactors, blocks upgrades, and makes adding any new dependency a gamble. Even projects from large organizations often include a significant number of abandoned packages.
I could go on with more examples, but instead I’ll stop and offer a concrete suggestion:
Create a category of PyPI packages that carry a PSF seal of approval. This seal would signal that the package is effectively part of Python, on par with the standard library, but with additional guarantees:

  • It will always support the current five non-EOL Python releases simultaneously.
  • It will maintain stable APIs for a long time.
  • It will have clearly identified maintainers.
  • It will provide explicit deprecation and removal plans with well-defined migration paths.

Even during deprecation, new releases should happen to support newer Python versions.

One major advantage of this approach is that new proposals for standard library additions could first be directed to this “PSF-approved” tier, allowing them to mature with real-world use and stronger stability guarantees before any consideration for inclusion in the actual stdlib.

1 Like

So this has STRONGER guarantees than the current standard library, but you feel that proposals should go there first? How would that work?

Currently, people often get advised to put it on PyPI first, because it gives them the freedom to have fewer guarantees. And there is often pushback because there’s a strong feeling that getting something off PyPI is more effort than using what’s in the stdlib. How will your proposal improve this situation?

1 Like

Who maintains these “PSF approved” libraries? What if the current maintainers don’t want to follow the policies you describe for PSF approved packages?

4 Likes

Addressing the questions above:

  1. It would have fewer guarantees than the stdlib. Things can evolve or be replaced, as long as there is a clear migration path.
  2. I said PSF, but I really don’t know who the backer should be. Just as PEPs have core devs as sponsors, these packages could have a sponsor too - someone with the PSF’s best interests at heart and the power to act if needed.
  3. You create your normal package, and it could (emphasis here) graduate into, let’s call it, an official package. This would be mandatory for any package that wants to be considered for the stdlib, although the stdlib might not be the goal of the project.

How’s this different from “You create your normal package and it could graduate into the stdlib”?

Because graduating into the stdlib may not be a goal of the package. The guaranteed maintenance would attract people to the package.

If some day it is decided that it would be a nice addition to the stdlib, you have a guaranteed-strong package, likely with a high user base.

It could also be used for the opposite effect: people would prefer an “official package” over a stale stdlib package. Thus there is less burden on the people actually working on core.

So you’re starting from the perspective that a proposal has come forward to add a Spamination module to the standard library, and currently, we might say “put it on PyPI first”, to which the response is often “it wouldn’t get much use if it were on PyPI”. (Which is entirely valid, and is an unavoidable consequence of Python’s and PyPI’s success; there are bajillions of packages there, and finding the right one IS hard.) According to your proposal, this could instead garner the response “put it on PyPI, in the PSF-sponsored tier, first”.

Does this have stronger guarantees than the stdlib? Weaker guarantees? I am not even sure what the proposal is any more.

Again, how is this different from what we currently have? If a package is living happily on PyPI and ends up migrating to the stdlib, it is already a strong package with a high user base. What is the advantage of the PSF-approved tier?

There have been a number of proposals over time to improve discoverability on PyPI, and discussions along those lines are always welcome, but what exactly is the proposal here? Notably, improving discoverability for a small and curated set of packages would be of very minimal benefit, since most people’s packages would never benefit from it - and as soon as one package gets given the PSF’s mark, every similar package would be instantly penalized.

Yes. That is a good thing. If you want to be considered, you must at least be better.

Or don’t you think adding modules like zstandard to the stdlib kills the competition as well?

It is a trust thing. I want that.

2 Likes

So let’s be specific. Who is doing the “considering”? What are the criteria for acceptance? How do I submit one of my packages to be in this “PSF approved” list? What commitments do I have to make in order to be accepted? What happens if I find that I can’t continue to keep the commitments involved? Does my package get removed from the list? Does someone else take over my package? Do I have a say in the matter? Is it even still my package, once it’s on the “PSF approved” list (because if it goes into the stdlib, I hand over ownership - is the PSF list “like the stdlib” in that regard, or “like normal PyPI”)?

4 Likes

It’s probably worth pointing out that even NumPy (which is pretty close to the non-standard-but-standard baseline that every other numeric library is based on) doesn’t manage that - it currently supports only 3.11, 3.12, 3.13, and 3.14.

That would practically mean that nothing else in the Scientific Python ecosystem could achieve this, because they all depend on NumPy.

5 Likes

I think all these questions should certainly be answered in a PEP, with no margin for guessing.
I’m in favor of the PSF (or some other body - let’s call it the governing body) ultimately owning the package, as if it were an integral part of Python itself.

The governing body would have to make some concessions to avoid picking up a lot more work.

Let’s start with a simple example: TOML is integral to Python nowadays. There was a recent TOML 1.1 release of the spec that may take at least five years (likely far more) to become usable - that is, if Python even considers it relevant to update from TOML 1.0 in the stdlib. We’ve seen this happen with YAML…
So let’s say the governing body decides that the official python.toml lib shall support both standards behind a flag. It would work with the maintainers and an appointed core dev to make that happen, perhaps through a simplified OPEP (official package enhancement proposal).
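
To make that concrete, here is a rough sketch of how such a flag might look. Note that python_toml and its spec parameter are purely hypothetical names, invented here for illustration:

    # Purely hypothetical sketch: neither "python_toml" nor the "spec" flag
    # exists; they only illustrate the dual-standard idea described above.
    import python_toml

    with open("config.toml", "rb") as f:
        data = python_toml.load(f, spec="1.0")  # strict TOML 1.0 behavior

    with open("config.toml", "rb") as f:
        data = python_toml.load(f, spec="1.1")  # opt in to TOML 1.1 features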

A PEP is just a document. It can’t magically answer these questions. How will it know? It will know because you, the author of the proposal, answer those questions. :slight_smile: So you may as well figure out what the answers are here, in this thread, as currently there isn’t a proposal concrete enough to write up in a PEP.

4 Likes

There currently are libraries outside the standard library which are somewhat affiliated with the PSF (although I’m not sure of the exact arrangements).

There’s the PyPA collection of packaging libraries (Python Packaging Authority — PyPA documentation). Although arguably many are increasingly being superseded by tools outside the list (e.g. setuptools is no longer many people’s first choice for new work, and has been the subject of recent arguments about API stability).

In the type-checking world there’s mypy (which lives directly under the python organization on GitHub). Although again I think it’s no longer the one-true-choice of type checker that it once was.[1]

So it does suggest the type of affiliation you propose does exist in places, although not with quite the degree of enforced stability that you’re after.


  1. I’m not an enormous type-hint fan personally, so I’m not 100% up to date here ↩︎

2 Likes

This is not a proposal… yet.

I specifically used the word “suggestion” in my original post. I have issues with packages simply dying, and I would like a better way to pick packages that have a higher chance of lasting, or that offer clear paths for evolution.

As this has the “ideas” tag, I would first wait to see whether there is enough traction to turn a suggestion into a proposal. All three people who have interacted here seem not to like the suggestion (although the NumPy argument above is pretty much proof that we need something to address this).

Yes, the PyPA org came to mind. Last year’s event, in which setuptools broke millions of packages and deployments, made it clear that it has governance issues. Plus, with alternatives like Astral’s uv, it seems there’s room for even recommended tools to be superseded.

I see what you mean.

Yet having the PyPA is a thousand times better than not having it. It still guarantees a safe haven if a third-party solution gets acquired, killed, dismembered, or whatever.

I think they need to be answered here, by you, as unless you get some sort of support here there’s no point in writing a PEP, and you won’t get support if you keep deflecting requests for specific details.

So why would a package author choose to give up ownership of their package? What’s in it for them? Why would they continue maintaining the package if they no longer owned it?

I ask again: who is that “governing body”? The PSF isn’t a technical organisation; it can’t make decisions like this. Also, what’s the “official python.toml lib” here? The tomllib module is already part of the stdlib. I assume this is just an example, but it’s frankly not a very good one.

What “appointed core dev”? Core devs are volunteers, you can’t “appoint” them to work on anything. And what is this OPEP process? Who defines it? How does it work?

To be blunt, at this point it’s pretty clear you don’t actually have a proposal here. At best you have some wishes with no clear idea of how you’d make them happen (and yes, it’s you who has to make your idea happen - you can’t assume others will do so).

Either provide some specifics, or there’s simply nothing to discuss here.

You make it sound like this is a bad thing.

And I did state exactly that in reply to someone above.

I’m trying more to spark a discussion about abandoned projects and what they cause.

AFAIK this is an “ideas” forum. If it is not the best forum for that, let me know.

I would love input on what else could be done, rather than drafting an airtight proposal just for it to be shut down because I might not have the full picture.

5 Likes

I think this response comes across as unnecessarily dismissive toward people who are still exploring ideas and looking for constructive discussion. Not every idea starts as a fully formed proposal, and allowing open, good-faith exploration is consistent with the collaborative development that the PSF CoC encourages, not disrespectful of anyone’s time. If an idea lacks traction or feasibility, that will become clear naturally, without the need for abrupt gatekeeping.

9 Likes

I think the big unanswered questions are:

  • What are the benefits for package maintainers? Joining this program involves making extra commitments to future support, giving up decision-making authority over your project, and making your continued access to your project subject to the PSF code of conduct.
  • What is the benefit to the PSF? Presumably it has to devote some time or money to the maintenance of these projects (at least in the case that the original maintainer doesn’t meet their commitments).

I think “to what extent should the PSF be involved in the broader Python ecosystem?” is a reasonable question. But it doesn’t feel like there’s a big reason for anyone to go for this.


Taking my NumPy example, about which you said:

although the NumPy argument above is pretty much proof that we need something to address this

I’d argue the opposite. NumPy is a relatively well-maintained and well-resourced project[1] that takes a fairly careful approach to providing long-term support and avoiding dramatic changes. And they’ve come to the conclusion that their support window should be shorter than the one proposed here.


  1. I’m sure they’d claim otherwise of course… ↩︎

4 Likes

I’m sorry if it seems that way. But my questions were all genuine - I really don’t understand what the proposal is here.

Maybe I’m assuming too much knowledge on the part of the OP. If so, then I apologise. If you read through any of the other discussions here, you’ll see fairly quickly that the person who starts a discussion is typically expected to manage the conversation, providing at least some details, even if they are rough or speculative at first. And if you’ve done any research on the management of PyPI, you’ll find that there have been various discussions in the past about having a “curated” subset of PyPI - those discussions have always gone nowhere because no-one could answer the question “who’s going to do the work, or provide the resources?” This proposal is basically the same as those earlier ones, and my questions are intended to determine how things will be different this time - does the OP have a new idea for how this “PSF approved” subset of PyPI would be managed, or are we just going over old ground here?

I certainly don’t mean to discourage the OP. But on the other hand, I’m getting increasingly frustrated that all of my (IMO legitimate) questions are being dismissed as “we’ll work that out later”.

Maybe I’ll just drop out of this discussion until “later” finally arrives :slightly_frowning_face:

4 Likes