The stable ABI is an obstacle to improving CPython.
The limited API, and the API in general, is also a problem, but much harder to fix.
Let’s keep the limited API, at least until we have a completely new C API in 2030 or whenever.
I also think that we want to keep ABI stable within releases (no ABI changes after late beta/RC).
The big problem with the stable ABI is that it prevents a number of otherwise quite reasonable changes:

- Re-ordering fields in structs, or adding new fields.
- Alternative reference counting implementations (see PEP 683).
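To make the reference-counting point concrete, PEP 683's immortal objects (CPython 3.12) changed what the refcount field means at runtime. A quick, hedged sketch of observing this from Python (the exact values are implementation details that vary by version, which is exactly why the ABI matters here):

```python
import sys

# sys.getrefcount reports the object's ob_refcnt field.
# Under PEP 683 (CPython 3.12+), singletons such as None are "immortal":
# their refcount is pinned to a sentinel value and never drops to zero.
# An extension built against an older ABI that inlines refcount
# arithmetic directly on ob_refcnt would silently violate that invariant.
refs = sys.getrefcount(None)
print(refs)  # a very large sentinel on 3.12+, an ordinary count before
```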
The stable ABI was never designed as an ABI; it just ossified parts of the CPython implementation at the time.
Is it really worth the cost, in time and money, that worse performance imposes on all Python users, just to save a few extension authors the minor inconvenience of re-compiling their code once a year?
Both issues you talk about could be solved by a one-time compatibility break (an abi4) rather than removing the stable ABI entirely. Specifically: to make those changes, the remaining runtime structs (PyObject & PyVarObject) would need to become opaque.
Yes, for sure, open source is an essential aspect of any system that would entail guaranteed periodic re-compiling of code by extension authors. However, please allow some further elaboration by one who is admittedly not at all involved with the Core Development group.
If an outcome of this discussion is that periodic re-compilation will, in fact, be required, then perhaps a table of successor authors could be maintained. Each extension would be listed as a key in the table, associated with a list of volunteers who either agree to pick up and occasionally enhance the extension when appropriate, or simply accept the responsibility of re-compiling it whenever necessary. When an extension comes due for attention, a message could go out to the current author. If the reply is negative, or there is none at all, the volunteer successor authors would be contacted, with those who offer upgrades, rather than mere re-compilation, given first priority.
The above might smooth the bumps when authors become permanently unavailable, and could potentially identify volunteers to continue to supply updates.
Not sure. Does anyone have data on how many abi3 wheels there are on PyPI, compared to minor-version-specific wheels, to measure the impact? We all know the Qt examples like PySide6, but without at least an initial measurement we would be guessing at the impact with zero guidance.
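One way to gather such data: wheel filenames encode the ABI tag, so counting abi3 wheels is mostly string parsing. A rough sketch (the filenames below are illustrative, not real release data; in practice you would feed it the file list from the PyPI JSON API at `https://pypi.org/pypi/<project>/json`):

```python
def abi_tag(wheel_filename: str) -> str:
    """Extract the ABI tag from a wheel filename.

    Wheel names follow {dist}-{version}(-{build})?-{python}-{abi}-{platform}.whl,
    so the ABI tag is the second-to-last dash-separated field.
    """
    stem = wheel_filename[: -len(".whl")]
    return stem.split("-")[-2]

# Illustrative filenames only:
files = [
    "cryptography-41.0.0-cp37-abi3-manylinux_2_28_x86_64.whl",
    "somepkg-1.0-cp311-cp311-win_amd64.whl",
]
abi3 = [f for f in files if abi_tag(f) == "abi3"]
print(len(abi3), "of", len(files), "wheels are abi3")  # 1 of 2 wheels are abi3
```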
I’m strong -1 on removing the stable ABI. It would impose a huge maintenance burden on PyPI packagers with binary extensions.
A good example is cryptography on PyPI. The package provides binary wheels for several combinations of CPU architectures, platforms, and manylinux and musllinux targets. There are currently 11 cp36-abi3 binary wheels for each release. Without the stable ABI, @alex_gaynor and @reaperhulk would need to build 66 binary wheels to cover Python 3.6 to 3.11. And they only have wheels for x86_64, aarch64, and one win32 wheel for x86; the package does not provide binary wheels for PPC64 or LoongArch.
That would have a serious negative performance impact on reference counting. I think requiring re-compilation is the better option.
HPy seems to have this in their design already. With their universal mode, you get ABI compatibility, and with their non-universal mode you get runtime-specific performance features. This lets extension authors develop against one API/ABI but reap the benefits of both worlds: all they need to do is decide which builds to make available.
+1 on removing the concept of the stable ABI in its current form. We should still keep changes to the ABI to a minimum (as we’ve always done) and be careful with changes that require a recompile, but not outright ban such changes going forward.
As a compromise, perhaps we could mandate keeping a subset of the ABI stable for (at least) 2 minor releases and then allow changes when progressing to the next stable ABI version.
Also note that automatic build platforms are quite common nowadays. We still don’t have one for PyPI, but in the conda world, conda-forge is doing great and does away with having to upload tens of binaries for every release.
For wheels, we package authors can use e.g. GitHub to create automated builds. A platform similar to conda-forge for wheels would be nice to have, though, and would also allow for more security when it comes to installing wheels from PyPI.
That’s not abi3, though, is it? I’m pretty sure HPy has its own ABI that is incompatible with abi3.
@markshannon Would it help you if we were to relax PEP 384 and introduce a new stable ABI? PEP 384 promises that abi3 will be supported throughout the entire life cycle of Python 3. The PEP is from 2009 and targets Python 3.2. Back then we did not anticipate that Python 3 would ever have two-digit minor releases.
We could consider introducing a new stable ABI that would be valid for another 5 to 10 years.
I think that the PEP 384 promise should remain. If the stable ABI is broken, it surely mandates a Python 4 (it may not be as big a breakage as Python 2->3, but it’s definitely not to be taken lightly either, as I can see lots of added burden on extension authors due to this change).
Particularly for my use case (the pydevd/debugpy debugger), the debugger does an attach-to-process, which loads symbols from the CPython DLL and calls them as needed. If symbols in the Python DLL (stable ABI) change, it would definitely break debugger attach (for the debugger use case, at least, not all the needed functionality is available in the limited API) – note that even changes such as re-ordering fields would break the debugger, and it would need to adapt specially (just recompiling is not an option in this case).
Note that I’m not particularly against breaking the stable ABI, as long as it’s made properly visible and the core developers provide a migration path for the cases where the limited API isn’t enough at this point (I know a debugger is probably a special case, and I’m OK with adapting it between versions, but I wouldn’t be OK with losing the ability to do something that is available now).
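To illustrate why field re-ordering breaks such tools: anything that reads interpreter memory directly bakes the struct layout into itself. A hedged ctypes sketch (it assumes the historical CPython layout where `ob_refcnt` is the first field of `PyObject` and that `id()` returns the object’s address; both are implementation details, which is exactly the point):

```python
import ctypes
import sys

class PyObjectHeader(ctypes.Structure):
    # Mirrors the traditional start of CPython's PyObject. If CPython
    # re-ordered these fields, this structure (and any debugger or
    # profiler with a similar hard-coded layout) would silently read
    # the wrong memory.
    _fields_ = [
        ("ob_refcnt", ctypes.c_ssize_t),
        ("ob_type", ctypes.c_void_p),
    ]

obj = object()
header = ctypes.cast(id(obj), ctypes.POINTER(PyObjectHeader)).contents
# The raw field and the official API should roughly agree
# (getrefcount adds temporary references of its own).
print(header.ob_refcnt, sys.getrefcount(obj))
```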
My hope/dream is that we will sit down as a group and figure out what kind of C ABI/API we would want if we were to start from scratch (I know HPy has been held up as that API, but I don’t think we have made that as a statement instead of an idea). With that identified, we make that API available. Once we have that, we set a very long time frame (e.g. 5 years minimum, possibly a decade) for projects to migrate over to this ideal ABI/API and we provide as much help as possible in migrating. After that time frame, we drop the old ABI/APIs.
Whether we call the final result when we cull the old APIs Python 4 or not I personally don’t care, but it’s effectively what I’m suggesting we work towards.
I agree… the way I’d probably go with it is to define the API that is needed, and then at each new release cut some part from the stable ABI and ask the community to upgrade accordingly.
The main work is definitely on the third-party maintainers’ side, along with managing expectations, but cutting up the API while providing alternatives at each new release, in small parts rather than asking for one full migration, may ease some of that pain…
As a note, given that the problem with the stable ABI is that it ossified things and makes transitions hard, I wouldn’t even see it as bad to rename it a semi-stable ABI: leave alone the things that don’t need to change, and break only when some enhancement actually requires it (so it would still be stable until a feature genuinely needs to change it, spreading lighter breakage over time).
One problem I see is that a range of tools (such as debuggers and profilers) may not be able to use a wrapper API, and instead need access to the actual symbols in the CPython DLL, with a known memory layout for the data they access.
That is the case for attach-to-process in a debugger without any previous setup to make the attachment work, or for a profiler implemented as an external tool (e.g. py-spy: github.com/benfred/py-spy) – maybe HPy fits the bill, but at a minimum the API symbols must be in the CPython DLL. Providing them as a third-party library doesn’t cut it in this case.
Having to build binary wheels for many platforms and Python versions is super-exhausting for package maintainers (and that’s even with tooling like cibuildwheel). I just spent a significant amount of time porting nanobind to the stable ABI so that I won’t ever have to think about this again in the future. The API/ABI lacked features that painstakingly had to be added for this to work.
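For anyone considering the same switch, opting in is small in build-configuration terms. A minimal setuptools sketch (the module name is a placeholder, and `0x030C0000` is the version hex that targets the 3.12 limited API):

```python
from setuptools import Extension

# Hypothetical extension module; only the two limited-API knobs matter here.
ext = Extension(
    "mymod",
    sources=["mymod.c"],
    # Restrict the build to the limited API of CPython 3.12+ ...
    define_macros=[("Py_LIMITED_API", "0x030C0000")],
    # ... and have the resulting wheel tagged abi3 instead of a per-version ABI.
    py_limited_api=True,
)
print(ext.name)  # mymod
```

The resulting wheel can then be installed unchanged on every CPython release from 3.12 onward, which is the whole appeal.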
The combination of limited API + stable ABI constitutes a promise made by the Python community. It says “If you put in extra work to implement your project with this smaller set of functions, we guarantee that it will run in future Python versions without recompilation.” It therefore makes me sad to see a discussion about intentionally breaking such an important promise here.
While it’s easy to say “I have grown weary of this agreement … let’s get rid of it!”, such a course of action will cause an untold amount of tedious work downstream, and it will break the contract of stability (i.e. the whole point of this API/ABI in the first place).
In response to @pf_moore: abi3 has only just become good enough for many projects. For my projects, Python 3.12 will be the first release where the limited API is finally complete enough that I can make the switch and avoid the fragility of having so many builds. Now @markshannon wants to tear it all down. This is so frustrating!
I think that the very important middle ground here between “ABI change for every feature version” and “stable for eternity” (which I don’t think anyone ever seriously intended to promise, esp. at a time when a widespread assumption was that 4.0 would follow 3.9) is what @brettcannon mentioned: the stable ABI is stable for a long time (e.g. 5-10 additional years), but can still evolve eventually to abi4 or whatever it ends up being called.
Without taking sides: it’s probably also frustrating to try to evolve Python (a very important task, I think we can agree – the alternative is stasis and eventual decay) when so much of it ended up covered by an old promise that keeps getting extended, making such evolution nigh-impossible.