Speaking from the perspective of a maintainer of downstream libraries, there is one unnecessary thing that core Python does that contributes to the problem of users wanting all packages available from day one: the Python download page always defaults to suggesting the latest release of CPython, even if that release is only a day old.
Users who want to use Python along with various other packages will find it disappointing if they install the “default” version of Python only to discover that many important packages are not yet available. Those users are not well served by being guided to install day-old releases of Python.
As @hugovk noted, there’s numpy 1.26.0b1, which has supported 3.12 for about three weeks now. Note that this is not an “average” new CPython release, because the removal of distutils has a large blast radius for libraries like numpy, scipy, etc. This is by way of explaining why things are taking a while, despite people working on this with very high urgency.
Behaviour is covered by the same policy we have for Python code – PEP-387. It works pretty well: you can generally assume your code will continue working. It’s rare and discouraged for packagers to set “defensive” upper limits on the Python version.
I’d be all in for a stricter PEP-387. But I think ABI stability guarantees should stick to ABI.
Indeed, that’s a direction the stable ABI could evolve in. In fact, HPy essentially does this today.
But I’d like to support the stable ABI in the reference implementation of Python, and as far as I can see, it doesn’t hinder development much more than API stability guarantees do.
Going back to vim: I’d love to hear your thoughts on how desktop applications should handle Python scripting/plugins. IMO, we should have a lighter-weight way to do that than each such project becoming a redistributor of Python and, e.g., re-releasing each new Python security fix. (Ask the release managers how painful it is for CPython to bundle OpenSSL!)
Allowing users to use a Python they already have sounds like a good way to go to me. It’s not perfect yet, of course, but it’s a good direction.
Oh. Thanks!
If it was me I’d write a PEP. I don’t really understand why one is not needed for such massive C-API reorganizations.
Sorry that you feel the pressure, but AFAIK others want to improve this situation. A new release of Python should just work. Sure, it never was that way and maybe we’ll never get there, but that doesn’t mean we shouldn’t try.
Perhaps there’s something we can do to reduce the pressure on people who care about other things?
Yet, it has users that are very happy about it.
It helps neither the homebrew install crowd nor the [click here to Download Blender] crowd.
It’s not just building, and it’s not just wheels.
I agree with Victor here. It may not be practically possible to start working on wheels before the final Python release. One reason can be that dependencies (such as NumPy) are not ready. Another reason is that Python RCs are not available in most distribution channels (such as conda-forge). Actually, even the final Python may take several weeks to be packaged in those distribution channels.
You cannot really ask package maintainers to go out of their way and implement a different build or testing procedure for Python RCs (or betas) than the one they use for released Pythons.
And of course if the new Python version requires changes to the package’s source code to maintain compatibility, then the new wheels will lag even more than if a mere rebuild had been sufficient.
I wish this would happen too, but work seems to have stalled since Feb 2022. While NumPy is an attractive target, perhaps it’s really too complicated and HPy should have started with a simpler project?
Right, but this effort is not going to go away by using the stable ABI.
Maintainers will still have to test their packages with the new Python release and fix any issues they find. Note that packages typically include Python code as well, which doesn’t magically continue to work just because the included C extension used the stable ABI.
FWIW, I don’t think it’s a good idea to tell users: hey, look, you can continue to use the packages released for Python 3.11 with Python 3.12, since the C extension uses the stable ABI, when the package hasn’t actually been tested with 3.12.
Likewise, users should be made aware that packages they are installing with pip install may actually not be tested with the just-released new Python version.
Making it easier to port C extension packages to new Python releases sounds like a much better plan, since then the effort for the maintainers is materially reduced and not just postponed.
Which is why I think effort on the core dev side is better spent on projects such as your compatibility tooling, rather than maintaining two variants of the same API.
I have already stated my opinion on this: the desktop application maintainers are in charge here. For Linux distributions, esp. the paid ones, the distribution companies should do this kind of packaging and relieve the maintainers from these tasks.
I know that handling OpenSSL upgrades is painful (we maintained a client-server product using Python and OpenSSL for many years), but that’s mostly due to the OpenSSL side of things, not so much because Python makes this difficult.
If there are many such desktop applications, perhaps the maintainers of these could join forces and create a distribution of Python which is geared towards making embedding easy and painless for them. I don’t think this is something the core dev team should be taking on.
This is an interesting point. People with experience know to pin their dependencies, test their apps automatically, and wait for the x.y.1 release before upgrading, but someone getting started may find the official download page and get the most recent release while the paint is still fresh on it. What do people think about reorganizing the page slightly so that the current stable version appears above a very recent release?
As a point of reference, the python stub that ships with Windows doesn’t switch to the latest release until there’s “broad community support” for it, which is a deliberately vague definition to allow us to judge the situation around each release (and avoid having people try and game it).
I suspect it only shifts the goalposts. Right now, package maintainers are able to release compatible binary wheels as soon as Python RC1 is released (and made available to build tooling), so X.Y.0 already has some buffer. Making the default switch over later just increases the buffer (which there may be an argument for, but then I would say it’s easier to extend the RC period).
I think one problem here is that package maintainers aren’t aware RC releases are out, simply because they have better things to do than keep up with Python pre-releases, and the first time they are reminded of a new release is when users ask about support for it, which might be when X.Y.0 is out. My suggestion would be for us to be more proactive in notifying maintainers (I have an idea).
PS: I think this discussion is off-topic with respect to limited C API usage in the standard library, and could be split off into a new thread.
I will not personally do that. I will prepare for the new release of CPython and have it tested through the alpha, beta, etc. stages in CI. I will not push out a release of my package claiming compatibility with CPython X.Y, though, until I can see the build complete and the tests pass with the final release of CPython X.Y.0. The time at which I issue the release supporting a new CPython version is always going to be after CPython issues its release. How long it takes depends on a bunch of factors, because there are almost always other things that need to be considered at the time, even when everyone involved is not simply busy with entirely different things.
The last change was on May 11. Since then, we’ve worked on upstreaming the necessary changes to HPy and making it work with GraalPy and PyPy. We presented our results at EuroPython this year. It’s not stalled.
Indeed, HPy could solve many of the problems discussed here. For CPython, it can remain a separate PyPI package, acting as a shim that translates HPy calls to the CPython C API. The only thing necessary for a new CPython release would be to ensure this shim continues to work; 3rd-party packages using HPy would need no recompilation. HPy is intended to be a smaller, more abstract subset of the whole CPython C API, so we believe its binary compatibility can be easily maintained. Moreover, its design is tailored to providing long-term ABI compatibility. There will always be packages that require lower-level or CPython-specific APIs, and that’s fine; they will continue using the CPython C API. However, the vast majority of packages can function with the HPy API (we believe that porting NumPy and a few other smaller packages showcases that).
Additionally, I think a smaller “clean room” API would be a good target for better “standardization”: a specification of the API’s contract detailing what is supported and what is not, plus documentation and tooling. The HPy design allows one to run the same binary in a “regular” mode and in a “check that I am not breaking the contract” mode. The idea is that unless your code works in the contract-checking mode, you cannot count on ABI compatibility, so one can add even relatively expensive checks that prevent abuse of the API and lock down the contract for future development without affecting “production” performance.
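To make the shim idea more concrete, here is a deliberately simplified, hypothetical sketch in C (it is not HPy’s actual API; all names such as `ToyHandle` and `ToyContext` are invented for illustration) showing how a handle-based API can be layered on top of the CPython C API so that extension code never depends on PyObject internals:

```c
/* Toy sketch only: NOT HPy's real API.  Illustrates the idea of a
 * handle-based API implemented as a shim over the CPython C API:
 * extension code sees opaque handles and a table of function pointers,
 * so no PyObject internals leak into the extension's binary interface. */
#include <Python.h>
#include <stdint.h>

typedef struct { uintptr_t bits; } ToyHandle;    /* opaque handle (made-up name) */

typedef struct {
    ToyHandle (*long_from_long)(long value);
    long      (*long_as_long)(ToyHandle h);
    void      (*close)(ToyHandle h);
} ToyContext;                                    /* made-up context type */

/* The shim: each entry just forwards to the regular CPython C API.
 * A real implementation would manage a proper handle table. */
static ToyHandle toy_long_from_long(long value) {
    return (ToyHandle){ (uintptr_t)PyLong_FromLong(value) };
}
static long toy_long_as_long(ToyHandle h) {
    return PyLong_AsLong((PyObject *)h.bits);
}
static void toy_close(ToyHandle h) {
    Py_DECREF((PyObject *)h.bits);
}

static const ToyContext toy_ctx = {
    toy_long_from_long, toy_long_as_long, toy_close
};

/* Extension code written against the handle API: it depends only on the
 * ToyContext layout, not on CPython's object layout or refcount fields. */
static long add_one(const ToyContext *ctx, long value) {
    ToyHandle h = ctx->long_from_long(value + 1);
    long result = ctx->long_as_long(h);
    ctx->close(h);
    return result;
}
```

Because the extension only sees opaque handles and function pointers, the shim underneath can be rebuilt or swapped for a new CPython release without recompiling the extension, which is the property described above.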
We’ve reached different personal opinions, so I’ll only answer a part of this post:
Yet, that’s what happens for most pure-Python libraries. Why should native extensions be different? PEP 387 applies regardless of stable ABI: if something breaks without deprecation, it’s a bug in CPython (or an explicit exception).
FWIW, a hypothetical PyPI build service won’t help much with this issue. We might need a test service. And/or perhaps metadata that would allow pip to warn “this package wasn’t tested on this version of Python”.
UPDATE: I gave a talk at the Python core dev sprint in Brno; my slides: Python C API (PDF). I explained the benefits of the limited C API, why I consider the stable ABI to be key to the C API’s success in the coming years, and how using the limited C API for some stdlib C extensions will increase the limited C API’s code coverage and test coverage.
I created a new PR to build the _stat C extension with the limited C API: PR #110711.
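For anyone following along who hasn’t built against the limited C API before, here is a minimal sketch (the module name `demo` and its method are made up) of what such an extension looks like:

```c
/* Minimal sketch: a hypothetical extension module "demo" built against the
 * limited C API.  Defining Py_LIMITED_API before including Python.h restricts
 * the build to the limited API, so the resulting binary targets the stable
 * ABI (abi3) and keeps loading on later CPython releases. */
#define Py_LIMITED_API 0x030c0000   /* target Python 3.12 and newer */
#include <Python.h>

static PyObject *
greet(PyObject *self, PyObject *args)
{
    return PyUnicode_FromString("hello from the limited C API");
}

static PyMethodDef demo_methods[] = {
    {"greet", greet, METH_NOARGS, "Return a greeting."},
    {NULL, NULL, 0, NULL}
};

static struct PyModuleDef demo_module = {
    PyModuleDef_HEAD_INIT, "demo", NULL, 0, demo_methods
};

PyMODINIT_FUNC
PyInit_demo(void)
{
    return PyModule_Create(&demo_module);
}
```

Wheels built this way carry an abi3 tag (e.g. cp312-abi3) rather than a per-version ABI tag, which is what lets the same binary be installed on later Python versions.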
@vstinner looks like there will be a bunch of stable ABI packages via nanobind in the future (these are projects which transitioned from prior pybind11-based implementations). For example, JAX (a machine learning framework by Google) ported the bindings of its C++ component and specifically referenced stable ABI support as a reason to do so. Similarly, FEniCS (a popular finite element solver) and Google-Benchmark just went through similar porting efforts.