C API: My plan to clarify private vs public functions in Python 3.13


In issue #106320, I removed many private C API functions in Python 3.13. This intentionally breaks many C extension projects. Here is my plan for the Python 3.13 development cycle. In short, things should mostly be settled before Python 3.13 beta 1; otherwise, we just follow the backup plan (revert all removals).

Phase 1: now, before alpha 1

  • Remove as many private APIs (functions, variables, macros) as possible, as early as possible. In practice, this only means moving these APIs to the internal C API: they are not really “removed”, only “removed” from the public C API.
  • Identify affected C extensions: code search (I already started doing that) and building projects with Python 3.13 (I just tried to build numpy and ended up creating an issue for Cython).
  • At least report the issue to affected projects, or better, propose a fix.
  • Consider adding new clean functions (tested, documented, maybe with a better API) to replace the removed private functions.

Phase 2: Python 3.13 alpha 1 (October 2023)

  • Target: Python 3.13 alpha 1 (October 2023)
  • For the most popular projects like Cython, numpy and pip: revert removals to unblock these projects if no release with a fix can be shipped in the meantime. If possible, I would prefer to only revert removals once these projects are fixed upstream (a fix is merged). Alpha 1 should be at least minimally usable!

Phase 3: Python 3.13 beta 1 (May 2024)

  • Target: Python 3.13 beta 1 (May 2024), but honestly, finishing well before that would be better. It’s just that I don’t know how much I can do alone, so I prefer to be pessimistic :slight_smile:
  • For the remaining broken C extensions: decide what to do with the removals causing the most trouble. Propose a tool to automate migration to the new API? Add new public APIs to replace the removed ones? Revert removals to give C extension maintainers more time? Something else?

Why? No, really, why, Victor?

Well, maybe I want your life to be a pain :rofl:

No, seriously: there is ongoing work on fixing C API issues (the new C API working group!). We would like to fix issues one by one, but it will take time since we want to minimize the number of impacted projects (a smooth migration). If the C API is too broad, it’s hard to do this work.

Private APIs have multiple issues: they are usually neither documented nor tested, so their behavior may change without any warning. They can also be removed at any time without notice. Core developers don’t know how these functions are used, which makes them hard to enhance and maintain.

My idea is to proactively go through the long list of private functions: see how they are used, see if anyone uses them, and clarify the difference between private and public functions. My long-term goal is to only provide clean public functions (tested, documented, following the regular PEP 387 deprecation process) in the public API, and to put everything else in the internal C API. An alternative is to put some functions in the new PyUnstable C API: PEP 689 – Unstable C API tier.

As I wrote, I would like to convert some private functions into well-tested and well-documented public functions. IMO this work is important. During this conversion, we have to follow the latest C API guidelines to provide an even better API than before.

To migrate a project to a new, better public API, that project cannot afford to lose support for old Python versions! That’s the problem with the idea of starting a new C API and leaving the current C API unchanged: the new API would only be supported by new Python versions. The solution here is to provide the new functions on old Python versions: that’s exactly what the pythoncapi-compat project does, a header file providing these functions as static inline functions for old Python versions, back to Python 2.7!

In previous Python releases, I already followed a similar migration plan for deprecated functions: remove all deprecated functions scheduled for removal as early as possible at the beginning of a new development cycle, then slowly reconsider reverting some removals once affected projects are aware of the issue (the issue is reported, or better, fixed, even if no fixed release has shipped yet), to unblock these projects.

The plan is to get a beta 1 release that is as functional as possible, while allowing a somewhat bumpy road between now and beta 1.

In the worst case, the backup plan is simple: revert the removals. In fact, I didn’t remove the function implementations. I only moved their definitions from the public C API to the internal C API (from Include/ or Include/cpython/ to Include/internal/). A revert just moves them back.

Obviously, anyone is welcome to help me with this work: identify affected projects, report issues to these projects, propose patches, propose converting private functions to public functions, etc.

So, what do you think of this plan? Is anyone interested in helping me? :slight_smile:

See also


Well, my counterproposal: deprecate them (!), but keep them working for now. Only remove those that are actually a burden to maintain or that block development.
It’ll save work for you, and (more importantly) for users of the API.
See also @thomas’s post here: C API: What should the leading underscore (`_Py`) mean? - #11 by thomas

You cannot identify all affected C extensions. A PyPI search misses Conda packages, Linux distros, private codebases, &c. A search for popular packages undercounts people who are responsible enough to set up a PyPI mirror.
Not all maintainers are interested in fixing their legacy API usage, or in reaching out to you on the issue tracker, right now this year.
Also, in my experience it’s not easy to convince you to revert a removal.

It’s good to convert useful private APIs to public/unstable ones. But it can be done without massive breakage of legacy code.

IMO, pythoncapi-compat is great for providing new API for old Python versions. But if we want to provide old API for new Python versions, why not do that in CPython itself? After all, pythoncapi-compat seems to be for functions that can be implemented easily; keeping a deprecated function in CPython instead doesn’t sound like a big burden.


I cannot speak for other sources, but conda packages (and, I suppose, Linux distros) fetch their source code from somewhere else: either directly from a VCS (usually GitHub, GitLab, etc.) or from a PyPI source dist. Also, my experience is that many (most?) Python modules for which conda packages are provided also have an established presence on PyPI.

So I wouldn’t be too worried about Conda packages of Python modules with neither a PyPI nor a GitHub presence, as they must be rather uncommon.


FWIW, as a heavy Conda user, maintainer of a number of Conda-Forge packages and very occasional contributor to Conda itself: a large majority of packages are sourced directly from the PyPI sdists, and it is very uncommon to find packages that are on Conda-Forge but not on PyPI in some form (since it’s close to a superset of the work required, special cases excepted). I don’t think I could name a single one off the top of my head. And they certainly have to be on either PyPI or GitHub, GitLab, etc., so the recipe has somewhere authoritative to retrieve the source artifact from.

If a particular package tends to be much more heavily used by Conda-Forge users than by PyPI users, it could skew the top-N numbers somewhat, but it’s not a huge effect overall.

@encukou On the distro side, as the Python package maintainer for a popular Linux distro, maybe you could provide at least rough numbers on how many of your distro packages that use Python C extensions are not available on PyPI or on major code hosting platforms like GitHub and GitLab?


Why deprecate something that’s already private? Projects reaching for those functions know that private functions have no stability guarantees (and in the case of Cython for example, the project explicitly prefers dealing with that instability over any speed loss) – I don’t see why their choice to accept that risk should turn into an obligation for CPython.


Very, very rough estimate of packages not on PyPI: 5% (250/5000).
I don’t know how to count hosting platforms.

But rather than availability on PyPI, I worry about judging popularity/relevance (PyPI download stats seem heavily skewed toward environments that download directly from PyPI).
And also about ignoring whole kinds of software, like GUI apps (Blender, Kodi) or system library bindings (RPM, Samba), to name ones I know about.

Because we changed our messaging around this quite recently.
We still have underscored APIs in the docs; some might be widely used and/or were added by PEPs that imply they were intended for external use.

No, not all projects that reached for those functions in the last few decades know the current guarantees.

Underscore-prefixed API isn’t covered by PEP 387. We can change/remove it. But that doesn’t mean we shouldn’t be good stewards of it.


Progress report.

  • The private functions _PyLong_AsInt() and _Py_IsFinalizing() have been promoted to public functions, congrats to them!
    • PyLong_AsInt(): doc – I really like this one :slight_smile: It’s commonly used in the Python code base!
    • Py_IsFinalizing(): doc – the C flavor of sys.is_finalizing().
  • I’m keeping pythoncapi-compat up to date with Python 3.13, which added 12 C API functions. You can now enjoy cool new APIs like PyDict_GetItemRef() and PyModule_Add() on all Python versions! (I’m still unofficially supporting Python 2.7.)
  • I added PyObject_Vectorcall(), PyVectorcall_NARGS() and PY_VECTORCALL_ARGUMENTS_OFFSET to pythoncapi-compat (API added in Python 3.9). The implementation is not efficient (to keep it short), but it should help in having a single code base that works on all Python versions.
  • I removed around 272 private functions and 15 private variables from the public C API (compared to the Python 3.12 API).
  • A few internal functions are no longer exported to disallow using them outside CPython internals.

See also related discussion: Use the limited C API for some of our stdlib C extensions.

I declare the Python 3.13 season of “removing as many private C APIs as possible” over! I will stop here until Python 3.14. I closed the “Remove private C API functions (move them to the internal C API)” issue.

I will now focus on testing as many C extensions as possible on Python 3.13, seeing which functions are missing from the public C API, and considering making them public (add docs and tests, replace the _Py prefix with Py).

And as I wrote, I’m considering moving some _Py private functions back from the internal C API to the public C API (from Include/internal/ to Include/cpython/) if I don’t have enough time to fix enough C extensions, especially the functions which affect the most C extensions. The exact list of affected C extensions and functions has yet to be created.

I plan to continue the work in new issues and to keep this issue only as a pointer to the new ones.

In the current main branch (Python 3.13), there are 86 private functions (in Include/cpython/) exported with PyAPI_FUNC(). IMO, 86 private functions is way better than the 385 exported by Python 3.12: it’s easier to manage.

Some of them can be moved to the internal C API, but the remaining ones are the most complicated to move for various reasons.

For example, in 2020 I failed to remove _Py_NewReference(), since it’s used by third-party C extensions (and by Python itself, by the way) to implement nice free-list optimizations. There is no good replacement for it, and designing a public API for implementing a free list is not trivial. Such an API stays in a gray area: it should not be used, but I won’t blame you if you continue using it.

I didn’t count internal functions (Include/internal/); I don’t care about those. See the complete statistics of the C API.

Using the internal C API is fine, but in exchange, you are on your own: there is no documentation, the API is usually untested, error checking is missing or incomplete, etc. Most of the internal C API remains usable outside CPython itself because there are use cases for that, like debuggers and profilers which need to inspect Python internals without modifying their state. Usually, that means reading memory without calling functions (when possible).


Python 3.13 removed the deprecated functions to configure Python initialization, such as Py_SetPath(), and there is no replacement API in the limited C API: the PyConfig API (PEP 587) is excluded from the stable ABI because the big PyConfig structure has no ABI guarantees.

A new API is being designed and discussed in the “FR: Allow private runtime config to enable extending without breaking the `PyConfig` ABI” topic and in my PyInitConfig API PR. A PEP may be written for it.

Work is now ongoing in two directions to mitigate the cost of upgrading C extensions to Python 3.13:

See these two issues for details on the ongoing work.



  • Add new public functions: PyList_Extend(), PyList_Clear(), PyDict_Pop() and PyDict_PopString().
  • Revert 50 private C API removals and restore 2 includes (<ctype.h> and <unistd.h>) in Python.h.

Details: Is Python 3.13 going to be Python 4? - #29 by vstinner

This work is scattered across many issues and discussions. I wrote an article listing the most important ones: Remove private C API functions.


Update: The C API Working Group approved the PyTime API:

typedef int64_t PyTime_t;
#define PyTime_MIN INT64_MIN
#define PyTime_MAX INT64_MAX

PyAPI_FUNC(double) PyTime_AsSecondsDouble(PyTime_t t);
PyAPI_FUNC(int) PyTime_Monotonic(PyTime_t *result);
PyAPI_FUNC(int) PyTime_PerfCounter(PyTime_t *result);
PyAPI_FUNC(int) PyTime_Time(PyTime_t *result);

The private PyTime API of Python 3.12 is used by Cython.


Update: Python 3.13 gets some more public C APIs:

  • New PyCFunctionFast and PyCFunctionFastWithKeywords types (commit), used by Cython and PyO3, replacing the private types _PyCFunctionFast and _PyCFunctionFastWithKeywords (which are kept for backward compatibility). The private types were even documented!
  • New PyErr_FormatUnraisable() (PR), which replaces the private _PyErr_WriteUnraisableMsg(): a better API giving more freedom in the error message.
  • New PyLong_FromNativeBytes() and PyLong_AsNativeBytes() functions (PR), replacing the private _PyLong_FromByteArray() and _PyLong_AsByteArray() functions.

See also my TODO list.