Non-standard standardized library

We’re all well aware that unmaintained projects are bad. We’ve all been bitten by abandoned dependencies (and in some cases, been the ones demotivated into abandoning them). This problem has been around for decades without anyone really finding a good answer, so if the goal of this half-proposal is just to highlight that there’s an issue, then it’s a bit of a non-post.

Anything less than a properly thought out and reasonably airtight proposal to solve the problem will receive this same “Uh huh, now tell me how you intend to solve it” dismissal that everyone’s tired of giving.

5 Likes

This is very true; however, for there to be any sort of productive discussion, someone has to actually have a proposal that is being formed. Questions need to be answered, not deflected. You can’t open up a completely generic discussion and expect it to get anywhere. Imagine I said something like:

I think we should improve Python.

And then any time someone asks “Okay. How, though?”, I just say “Well, that’s not my place to say”. Would that be useful? Would that thread contribute anything, or merely waste everyone’s time?

This thread is, at the moment, in that sort of state. There’s nothing to discuss. But also, please keep in mind that people’s time is an extremely finite resource, and we participate in these discussions in the hope and faith that there’s something worth discussing. Empty threads result in responses like this:

and ultimately, to key people like core devs tuning out the entire Ideas section, because it’s simply not worth their time to discuss.

So. Here’s what I am thinking. If you (generic you, not a specific person) have a thought, but not any sort of proposal, then don’t post it here, post it to your own blog. You’ll still feel like you’ve made a post and “done something” that way.

Or if you want to actually have a productive thread, then be prepared to make actual decisions about an actual proposal, so that we have something to discuss.

4 Likes

OK guys, considering what you said about people giving up their packages, here is my new take, responding in a way that keeps ownership with the author and sets a low bar for something like this to be kicked off and possibly evolve at a later point in time.

Approval process
Who does the considering / what are the criteria / how to submit?

Ideally a small, dedicated working group (e.g., a new or existing PSF-sponsored committee, similar to how the Steering Council delegates areas, I think). Criteria would focus on: proven ecosystem value, an existing stability track record, maintainer commitment to the guarantees (multi-version support, API stability, deprecation plans), and broad community benefit. Submission could be a simple application process (like a form or document, see OPEP at the end) reviewed publicly.

What commitments does a maintainer make?
Exactly the ones outlined: support the latest 5 non-EOL Python versions, long-term API stability, clear deprecation/migration paths, and continued releases even during deprecation. No more than that: no forced feature additions or timelines beyond what’s sustainable.
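
To make “clear deprecation/migration paths” concrete, here is a minimal sketch of what I have in mind (the function names are made up for illustration): the old entry point keeps working for the whole deprecation window, but points users at its replacement.

```python
import warnings

def parse_config(path, *, strict=True):
    """New, stable entry point (hypothetical example API)."""
    return {"path": path, "strict": strict}

def load_config(path):
    """Old entry point, kept working for the whole deprecation window.

    Emits a DeprecationWarning so callers get a clear migration signal
    while existing code keeps running.
    """
    warnings.warn(
        "load_config() is deprecated; use parse_config() instead",
        DeprecationWarning,
        stacklevel=2,
    )
    return parse_config(path)
```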

If commitments can’t be met
What happens if a maintainer can’t continue?

The seal would be revoked (the package reverts to normal PyPI status). No forced takeover. If the maintainer wants to hand off to new owners who can meet the commitments, then the seal could transfer. The original maintainer would always have a say (e.g., a veto on unwanted transfers). This is explicitly not like stdlib inclusion, where ownership transfers to the core team.

Ownership
The package remains fully yours. The seal is a certification/badge of quality and stability (like a “PSF-endorsed” label), not an ownership change. It’s more like normal PyPI with extra prestige and visibility.

Incentives for authors
Why give up anything / continue maintaining?

You don’t give up ownership or control. Benefits could include: higher visibility (official promotion on PyPI), increased adoption/trust (users prefer “blessed” packages), potential PSF support (grants, mentorship, or infrastructure help for maintainers, although I don’t know much about the economics here), and a possible path to eventual stdlib consideration if desired. Many authors already maintain popular packages long-term for reputation/resume value, and this would amplify that.

Governing body and technical decisions
Who is the governing body?

The PSF itself doesn’t need to make technical calls; it could appoint a technical group (e.g., the PyPA plus interested core devs) to handle them, similar to how packaging PEPs are managed today. If a package stops being noteworthy, the body may decide to revert the official status.

Core devs and process
Appointed core dev / OPEP?

No forced appointments. I understand core devs are volunteers, and any involvement would be voluntary sponsorship (like PEP delegates). An “OPEP” is my suggested mechanism for safe evolution: a public proposal process (Discourse + review) for changes in blessed packages, ensuring migration paths and version support. It would be defined collaboratively if the overall idea gains traction.

Other topics not addressed here:

Software licenses?

Would this force the process into very stable projects only?

Naming: I’ve used words like blessed, official, approved. Not sure what is best.

4 Likes

How can people rely on the seal to mean continued maintenance and support if packages that are given it could have support ended and the seal revoked at any moment?

4 Likes

That is a good question.

Revoking the seal would never be quiet or casual. It would go through a public process with lots of warning, open discussion, and official notes in places like PyPI. Losing the seal for dropping commitments would hit hard, as a major reputational black mark in the Python community, showing the maintainers bailed on their stability promises.

So while a package holds the seal, users get a solid signal about it, even if it’s not an eternal guarantee.

What else could be done? I don’t think you could do a monetary penalty unless it is like Google or Microsoft maintaining the project.

The suggestion that volunteers must be penalised when they start contributing less than before is so… progressive.

We as a society should definitely be doing more of this.

So, the seal means “this project fulfils certain expectations, until it doesn’t”.

I’m still not seeing how this is different from what we already have. Reputations are built and can be lost. Why do we need a centralized authority to tell me that pandas is a dependable package?

Why do you think certifications exist, then? ISO is a thing. It is 1000x easier to look for a check mark than to waste time investigating all the details of a project.

Gives you a solid baseline. Not meant to be perfect.

Also not for everyone. You can rely on your reputation if you wish, and that is fine.

If you need interoperability, sure! That’s why we have PEP 249, for example. But certifications don’t tell you “this is still going to be active in five years”.

But hey. You know what? Nobody’s stopping YOU from giving out a seal of approval to whichever projects you like, on whatever basis you like. Go ahead! Design a seal of approval, maybe even do up a fancy graphic, and issue it to the projects that you think are the best. The PSF does not have a monopoly on this, and quite frankly, I don’t think they are in a good position to pick out the best projects on PyPI (how many spare hours do you think the PSF has to throw at this?). So, go ahead, make this happen! Maybe you’ll be the big name that everyone looks up to.

2 Likes

I think it would be nice to have some kind of seal to show for long-lived, stable packages, as it would save you time from having to figure this out through other methods. I also think that some of that could be automated.

Suppose there was a bot that, whenever a new PyPI release appeared, would download the code, build it, and run the tests on every officially supported Python version. The bot would maintain history for 5 years.

If the bot detected that:

  1. Builds and tests passed for every supported Python version
  2. Test coverage was 80%+
  3. Every public API was still compatible with the way it was 5 years ago, or when it was initially created (number and types of arguments). In other words, you could keep adding optional arguments, but you could not remove or change anything that was in an older release.

and this was true for the past 5 years for every release, then it would give a seal of “stable package” or something. Of course you would need some way to deal with buggy releases that get fixed, and some grace period when new Python versions appear. The limits are just for example purposes; not sure what would be best.
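
To make the idea a bit more concrete, here is a rough sketch of what the per-version part of such a bot could look like. It assumes `python3.X` interpreters are on the PATH and that the package ships a pytest test suite; the version list and coverage bar are just the example limits from above, and a real bot would build from the sdist rather than test the installed wheel.

```python
import json
import subprocess
from pathlib import Path

# Illustrative knobs; a real bot would derive these from python.org's release data.
PYTHON_VERSIONS = ["3.9", "3.10", "3.11", "3.12", "3.13"]
MIN_COVERAGE = 80.0

def check_release(package: str, version: str) -> dict:
    """Install one release into a fresh venv per interpreter and run its tests with coverage."""
    results = {}
    for py in PYTHON_VERSIONS:
        venv = Path(f".venv-{package}-{py}")
        subprocess.run([f"python{py}", "-m", "venv", str(venv)], check=True)
        pip = venv / "bin" / "pip"        # POSIX layout; Windows would use Scripts\
        pytest = venv / "bin" / "pytest"
        subprocess.run(
            [str(pip), "install", f"{package}=={version}", "pytest", "pytest-cov"],
            check=True,
        )
        # Run the installed package's tests; only works if the tests ship with the package.
        proc = subprocess.run(
            [str(pytest), "--pyargs", package, f"--cov={package}", "--cov-report=json"],
            capture_output=True,
            text=True,
        )
        coverage = 0.0
        report = Path("coverage.json")
        if report.exists():
            coverage = json.loads(report.read_text())["totals"]["percent_covered"]
        results[py] = {
            "tests_passed": proc.returncode == 0,
            "coverage_ok": coverage >= MIN_COVERAGE,
        }
    return results
```

The third point (“only add optional arguments”) could be approximated by snapshotting public signatures with `inspect` at each release and diffing against a snapshot taken 5 years earlier. Again only a sketch; it ignores classes, type changes, and keyword-only subtleties:

```python
import importlib
import inspect

def snapshot_api(module_name: str) -> dict:
    """Record the parameters of every public callable at the top level of a module."""
    mod = importlib.import_module(module_name)
    api = {}
    for name in dir(mod):
        if name.startswith("_"):
            continue
        obj = getattr(mod, name)
        if not callable(obj):
            continue
        try:
            sig = inspect.signature(obj)
        except (TypeError, ValueError):
            continue  # some builtins/extensions have no introspectable signature
        api[name] = {
            p.name: p.default is inspect.Parameter.empty  # no default -> treated as required
            for p in sig.parameters.values()
        }
    return api

def compatibility_problems(old: dict, new: dict) -> list[str]:
    """List ways the new API breaks the 'only add optional arguments' rule."""
    problems = []
    for func, old_params in old.items():
        if func not in new:
            problems.append(f"{func} was removed")
            continue
        for pname in old_params:
            if pname not in new[func]:
                problems.append(f"{func}: parameter {pname} was removed")
        for pname, required in new[func].items():
            if pname not in old_params and required:
                problems.append(f"{func}: new required parameter {pname}")
    return problems
```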

1 Like

Cool idea! Write it.

1 Like

Although I think there are good impulses behind this thread, and real problems, I don’t see this thread as offering a solution.

There’s a deeper problem than just “who will do the work”. I don’t think we want the Python Packaging Cabal sitting in a smoke-filled room and picking winners and losers. That sounds like the opposite of a healthy packaging ecosystem.

The idea of automated tooling which assesses packages appeals to me more, but the assessment technique needs work. Testing packages like a redistributor is generally hard. And there are other metrics to consider, like regularity of releases. I think this area has potential, but needs someone to pioneer some way of ranking and examining packages based on currently available data.
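
For example, one piece of currently available data is release cadence, which can be pulled from PyPI’s public JSON API without any cooperation from maintainers. A rough sketch (the 180-day threshold is made up, and this is only one signal among many):

```python
import json
import statistics
import urllib.request
from datetime import datetime

def release_gaps_days(package: str) -> list[float]:
    """Days between consecutive releases of a package, from PyPI's JSON API."""
    with urllib.request.urlopen(f"https://pypi.org/pypi/{package}/json") as resp:
        data = json.load(resp)
    upload_times = []
    for files in data["releases"].values():
        if files:  # some versions have no uploaded files
            upload_times.append(datetime.fromisoformat(files[0]["upload_time"]))
    upload_times.sort()
    return [
        (later - earlier).total_seconds() / 86400
        for earlier, later in zip(upload_times, upload_times[1:])
    ]

def looks_regularly_maintained(package: str, max_median_gap_days: float = 180) -> bool:
    """Crude heuristic: median gap between releases under roughly six months."""
    gaps = release_gaps_days(package)
    return bool(gaps) and statistics.median(gaps) <= max_median_gap_days
```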

4 Likes

I think this is a situation where there is a real possibility for you (or someone) to try doing it yourself, and I’ve sometimes thought it would be a useful thing for someone to do. That is, rather than having the PSF do anything, just start making a list of packages you think are good and should be recommended. Then publish the list somewhere.

In fact people often already do this with blog posts along the lines of “Ten amazing Python packages you should be using” or “The most essential Python libraries” or the like.

The hard part, though, is maintaining the list by regularly checking that all the libraries are still good. But if someone did that, it could be very useful to the community.

3 Likes

FWIW, the Rust ecosystem has an unofficial site at https://blessed.rs/ for a similar purpose.

1 Like

There are some lists of projects like that:

You can add to this something like how stable they are, or how much stability they project (for example, whether the maintainer is a serious foundation or just a solo developer, the track record, etc.).

On the flip side, I wish people would be more open to exploring lesser-known or alternative packages. A lot of the top packages out there I’d consider to either have better alternatives available or be heavily used for tasks that could be achieved more effectively with another, usually more focused, library.

Any form of blessed list has zero hope of auditing all 699,158 packages currently on PyPI, so it will end up limited to just the blockbuster packages, which in turn makes it another deterrent to exploration.

4 Likes

I do not see this as a problem that should be solved by a central body (PyPI or the PSF). A healthy ecosystem means that maintenance of packages is needed, especially since Python has a very stable and predictable release cycle. In pyproject.toml, every package owner can already specify which versions are supported. If you hit a problem with an unsupported package on a certain Python release, contact the maintainer(s) or help them. The current release cycle of newer Python versions is imho very predictable. Changes to modules of the Python standard library are part of newer Python versions, and backwards-compatibility issues are discussed in the PEPs that come with a new Python version. Solve problems where they are created, if it is a real problem.