Python 3.13 alpha 1 contains breaking changes, what's the plan?

Hi all,

I haven’t followed CPython’s core discussions much since they (sadly) moved away from python-dev, so I ran into the C-API changes that were made for 3.13a1 with little prior warning. The sheer amount of changes alone leaves me baffled.

Hundreds of functions were removed, renamed, or replaced with different functions. Header files were removed. Names were removed from header files. Macros were changed, even a few publicly documented ones. And the default answer to “how do I replace this usage” seems to be “we’ll try to design a blessed replacement in time”.

Victor Stinner mentioned that he’d try to make as many existing packages work again as he can, but then, what about the PyPI packages that are not visible enough to be helped? What about the packages that used to work but are no longer actively maintained? What about the repos that simply won’t put out a new release even if you send them a PR? What about the packages that cannot be tested with 3.13 for weeks, months or years, and that won’t report back issues to CPython, because they depend on other packages that fail to build? What about the cases where Python devs disagree on a “good” public API, or where they disagree with users regarding the usability of a replacement?

I remember a story from a bank where I was working, where we found a piece of code that no-one understood and that looked unused. So we gave a developer the task to investigate. What he ended up doing was to comment out the code, adding the note “let’s see if someone will notice”. Well, no-one noticed it, until it was released and brought our production server down.

The situation that we ended up with in 3.13a1 seems as disruptive as the move from Python 2 to 3, although the pile of C-API changes back then was way, way smaller. The amount of breakage is so huge that I’m personally giving up on fixing it. I’d rather start using the internal APIs in Cython than try to make things work efficiently with the public ones again. I always argued against that approach, but I think this Python 4 moment is the time to seriously consider it. What would help from the side of CPython core would be to also keep the internal header files in C89/C99 style and to avoid breaking them during point releases. That’s probably not too much to ask. But I think CPython core should also be honest and openly call this release Python 4.



Each release in the V3 series makes small changes to the C API that need small changes in extension modules.

As an extension author I assume the changes in 3.13 will be more of the same.
I maintain PyCXX and PySVN.

Fedora folks have started to build against 3.13, and I have some build failures from that work to look at soon. That’s just business as usual; 3.12 also required changes.


No, this is not business as usual. See C API: My plan to clarify private vs public functions in Python 3.13

There are many build failures. Compiling a list is hard, because we can’t even attempt building some packages until their dependencies (like Cython) are working.

I wish such large-scale changes went through the PEP process.


Because the changes remove the private functions from the Python library (they are no longer being exported), you can only do this by copying the internal API’s sources verbatim into Cython-generated C code (either directly or via some #include).

Victor has a point in claiming that private Python C API should indeed be private, but his approach feels too radical.

At the very least, a PEP should have been written and approved by the Steering Council prior to committing the changes, so that the Python community of extension writers becomes aware of the intended change and gets a chance to participate in the discussion.



I just created C API: Meta issue to replace removed functions with new clean public functions to collaborate on this topic.

I plan to dedicate a large part of my time in the coming months to adding public C API functions and helping projects become compatible with Python 3.13. Porting Fedora to Python 3.13 is an active project in my team; the work just started last week (after the alpha 1 release). Cython is likely the first target, since it’s a very common dependency, and it is a good metric to see how things go with the C API changes.

Usage guidelines for the internal C API are not well defined so far. Exported functions can be used outside Python, but there is no backward compatibility guarantee.

I think that what you are describing is closer to PEP 689 – Unstable C API tier: unstable but supported API. For example, PyLongObject changed a lot in Python 3.12. The PyUnstable_Long_IsCompact() and PyUnstable_Long_CompactValue() functions were added to the unstable API: you can use them, and they are guaranteed not to change during the Python 3.12.x lifecycle, but they may change or be removed in the 3.13.0 final release (in any 3.x.0 release): see the PEP for details.

We take more freedom regarding “C99 style” in the internal C API. For example, we don’t fully support C++ in the internal API. It’s best-effort support: if we can, we support C++, but not if it requires too much work.

Python has been trying to “explicitly” support Cython for a long time: see for example [C API] Add explicit support for Cython to the C API · Issue #89410 · python/cpython · GitHub. The problem is that it’s a lot of work, and it seems like, so far, nobody has managed to provide a clean API for every single private/internal API usage made by Cython. It was done more on a case-by-case basis. Cython is different from other C extensions since it wants to be as fast as or faster than CPython, and for that it “abuses” a lot of CPython internals. By “abuse”, I mean that Cython code is fragile and CPython cannot guarantee that it will not break. Adding APIs to abstract “a little bit” the relationship between CPython and Cython may reduce friction. Adding public, documented and tested APIs is part of this work.

My plan is to add functions so that it’s possible to write code compatible with old and new Python versions with a single code base. The most annoying part of Python 3 was that it was really hard to have a single code base working with Python 2 and Python 3.
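The single-code-base idea here is about the C API, but the underlying pattern is easy to illustrate at the Python level (a minimal sketch of my own, not any official tooling): detect whether the newer API exists instead of comparing version numbers.

```python
import math

# Feature detection: check whether the newer API exists rather than
# comparing version numbers, so a single code base runs on both old
# and new Python versions.
if hasattr(math, "isqrt"):           # math.isqrt() was added in Python 3.8
    integer_sqrt = math.isqrt
else:                                # fallback for older interpreters
    def integer_sqrt(n):
        return int(n ** 0.5)

print(integer_sqrt(2025))  # prints 45
```

The same shape applies in C with preprocessor checks guarding old versus new function names.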

There is also a tool which adds support for new Python versions without losing support for old Python versions. I haven’t updated the tool recently; it only handles some of the Python 3.12 changes.


We can consider exporting some internal functions for the needs of Cython. But I would prefer to do that on a case-by-case basis, and to consider other, more “future-proof” options, like adding a public C API. Sticking with the internal C API does not make Cython updates easier. For some use cases, there might be a way to keep good performance and have a clean public API.


I also elaborated my plan for Python 3.13 and future versions (limited C API, stable ABI, clarifying private vs public vs internal, etc.) in a talk that I gave at the Python core dev sprint in Brno 3 weeks ago: slides (PDF). My goal is to make Cython compatible with new Python versions as soon as possible. Well, Cython and all other C extensions. The ideal goal would be to have all C extensions ready on day 1 of the alpha 1 release, and that’s possible if we migrate most C extensions to the stable ABI. But there are still some technical challenges that we have to address.

The unstable tier is quite new. I imagined that after we added it, we’d review the existing underscored functions, and for each one decide whether to put it in the unstable tier or hide it.
I’ve pushed back against hiding them all, but in the end, that’s the decision that was made.


I have been following all the C-API related changes closely.
I welcome the work @vstinner is doing.
Free-threading is just putting added urgency on that work.

For me this is “business as usual”.


There will never be a Python 4.


I guess we need a PEP 4404!


Apparently, the Python 3.13 dev cycle looks like a good opportunity to do this work.

The real question is whether there will be a Python 24 [1] :smiley:

  1. as in Python 2024, 2025, 2026 … ↩︎


Or 42!

@barry you were missed at the core sprint. Glad to hear you are doing well (heard via Guido). :smiley:


Thanks @willingc !


Now you mention it, @ambv considered including some form of Calendar Versioning as part of PEP 602 – Annual Release Cycle for Python, but left it out to keep the PEP focussed.

Because of the annual release cycle, we kind of have calendar versioning:

  • 3.11.0 was released in 2022
  • 3.12.0 was released in 2023
  • 3.13.0 will be released in 2024.

So this is 3.x.0 where x = YY - 11.
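A quick sanity check of that formula (a throwaway sketch, assuming the annual cadence continues):

```python
def minor_for_year(year):
    # Current scheme: 3.(YY - 11).0 ships in year 20YY.
    return year - 2011

# 3.11.0 in 2022, 3.12.0 in 2023, 3.13.0 in 2024
assert [minor_for_year(y) for y in (2022, 2023, 2024)] == [11, 12, 13]
```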

This could be simplified by skipping some and using 3.YY.0:

  • 3.26.0 will be released in 2026
  • 3.27.0 will be released in 2027
  • 3.28.0 will be released in 2028

Or as YY.0:

  • 26.0 will be released in 2026
  • 27.0 will be released in 2027
  • 28.0 will be released in 2028

Some benefits:

  • Every feature release can currently have breaking changes, but there’s sometimes a misconception that Python follows Semantic Versioning and breaking changes are only allowed in 4.0. Now every feature release would come with a major bump.

  • We get to take a nice big leap over all that 4.0 baggage.

We’d of course have to wait until 3.14 is out in 2025 so we can make π jokes :slight_smile:


I know, I was a fan of that at the time!

One more “Pie Pie” to throw in the mix!


2025 will be a fun year for sure:
Python 3.14 (3.14.1 will exist for sure too, hopefully 3.14.15 and beyond too!)
2025 is a square year - 45^2 (the first one in most of our lives! The last one was 1936 and the next one is 2116!)
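And the square-year trivia checks out (quick sketch):

```python
import math

# Perfect-square years between 1900 and 2199.
square_years = [y for y in range(1900, 2200) if math.isqrt(y) ** 2 == y]
print(square_years)  # prints [1936, 2025, 2116]
```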


The second bugfix release of 3.14 should be named 3.14.15, the third 3.14.159, and so on.