PEP 743: Add Py_COMPAT_API_VERSION to the Python C API

link: PEP 743 – Add Py_COMPAT_API_VERSION to the Python C API | peps.python.org

Abstract:

Add Py_COMPAT_API_VERSION and Py_COMPAT_API_VERSION_MAX macros to let a C extension opt in to planned incompatible C API changes. Maintainers can decide when to make their C extension compatible, and which future Python version they want to be compatible with.
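To give a flavor of the mechanism, here is an illustrative sketch (not the actual CPython header) of how a function scheduled for removal in Python 3.15 could be hidden from extensions that opt in early:

/* Illustrative sketch only, not the real header.  Per the PEP,
 * Py_COMPAT_API_VERSION defaults to PY_VERSION_HEX when not set,
 * so existing builds are unaffected. */
#if Py_COMPAT_API_VERSION < 0x030f0000   /* hidden once 3.15 is requested */
Py_DEPRECATED(3.13) PyAPI_FUNC(PyObject *) PyImport_ImportModuleNoBlock(
    const char *name);
#endif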

I’m waiting for your feedback :slight_smile:

I’m not sure about the “Examples of Py_COMPAT_API_VERSION usages” section. The changes listed there are not part of the PEP, nor changes that I particularly want to make. They are more examples of changes which could be introduced more smoothly thanks to this PEP.

Victor

1 Like

This seems like an alternative to either testing with our alpha releases, or reading the deprecations section of What’s New.

Apart from being compiler-enforced one version earlier than an alpha release, what additional benefits does this provide?

It’s also worth noting that it is not easy to enable additional build variables in most backends. Generally it’s going to require modifying a setup.py or equivalent file, either to pick up a certain project-specific environment variable or to patch it for each special build. Do you really think projects will find this easy enough to be worth doing?

I like that with the _MAX variable, they don’t need to figure out which version to specify.

I find it really hard to understand how the new variable proposed by this PEP is supposed to be used by project maintainers. Combining Steve’s objection (that it is hard to set build-specific variables) with the use of the verb “test” in the PEP, I conclude that the idea is that maintainers aren’t expected to set this variable in their project.

It appears that instead you promote a workflow where a maintainer who wants to test whether their extension is compatible with e.g. Python 3.13 (assuming they know it works with e.g. 3.11) can do a special build where they add -DPy_COMPAT_API_VERSION=0x030d0000 to their CFLAGS (or the equivalent on Windows) and look for errors (for APIs they are using that have been removed in 3.13) or warnings (for APIs they are using that are deprecated in 3.13)?
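For reference, 0x030d0000 follows the PY_VERSION_HEX layout; a minimal standalone sketch that decodes it (the decoding code is mine, only the layout comes from CPython’s patchlevel.h):

#include <stdio.h>

int main(void)
{
    /* PY_VERSION_HEX layout: the top three bytes are major, minor and
     * micro; the low byte packs release level and serial. */
    unsigned long v = 0x030d0000;   /* 3.13.0 */
    printf("%lu.%lu.%lu\n",
           (v >> 24) & 0xff, (v >> 16) & 0xff, (v >> 8) & 0xff);
    return 0;
}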

That appears very obscure – even more obscure than feature selection macros like Py_LIMITED_API already are. I also recommend updating the PEP to clarify this workflow (if that’s indeed what you’re proposing) – neither the Abstract nor the Motivation made me understand this, and your section listing “Examples” doesn’t actually show how an extension maintainer could use the new variable.

Reading the above discussion I’m not sure about the usage model: are these macros supposed to be enabled unconditionally (including in release builds), or to be enabled selectively on some CI builds?

Would also be worth pinging some high-profile users of the C API, such as @rgommers @scoder @da-woods

From the short announcement it was not clear to me what it was about, but after reading the PEP I got the idea. It’s like winding the clock forward a few years to see which time bombs go off.

However, it is still not clear to me what the value of Py_COMPAT_API_VERSION_MAX should be. Is it simply 0xffffffff? Or 0x03ffffff?

Is Py_COMPAT_API_VERSION set to PY_VERSION_HEX or left undefined by default?

The idea LGTM in general, I see how it can be useful.

Let’s take the example of PyImport_ImportModuleNoBlock() deprecated in Python 3.13 and scheduled for removal in Python 3.15.

Currently, you have to wait until the function is removed in Python 3.15, and then complain that nobody told you that PyImport_ImportModuleNoBlock() is going away!

The PEP offers the ability to opt in right now for scheduled changes with -DPy_COMPAT_API_VERSION: two years before the Python 3.15 alpha 1 release!
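For this particular function, the port is a one-liner; a minimal sketch of the fixed extension code (PyImport_ImportModuleNoBlock() has been equivalent to PyImport_ImportModule() since Python 3.3):

#include <Python.h>

/* Port away from the deprecated alias before 3.15 removes it;
 * the replacement works on all supported Python versions. */
static PyObject *
get_sys_module(void)
{
    /* was: return PyImport_ImportModuleNoBlock("sys"); */
    return PyImport_ImportModule("sys");
}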

Aha, sorry, I should clarify that in the PEP. The suggested usage is to only request incompatible changes in a manual test, when you have free time to deal with them, or in a dedicated CI job which is allowed to fail (without blocking the workflow).

Obviously, if you are eager to be ready for scheduled changes before anyone else, you can set Py_COMPAT_API_VERSION to Py_COMPAT_API_VERSION_MAX and have your CI break every time Python adds new scheduled changes. I don’t recommend that. At least, set it to a specific version.

If we set Py_COMPAT_API_VERSION_MAX manually to the “maximum version of scheduled changes” (e.g. Python 3.15), we may miss scheduled changes if we forget to update Py_COMPAT_API_VERSION_MAX.

Maybe we can be lazy/reasonable and set Py_COMPAT_API_VERSION_MAX to Python 4.0? Setting it to 0xffffffff also works.

Extract of the PEP:

If the Py_COMPAT_API_VERSION macro is not set, it is set to PY_VERSION_HEX by default.
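In preprocessor terms, that default presumably boils down to something like this (a sketch, not the PEP’s literal text):

/* Sketch of the default: opting out of nothing means "the API of the
 * Python you are compiling against". */
#ifndef Py_COMPAT_API_VERSION
#  define Py_COMPAT_API_VERSION PY_VERSION_HEX
#endif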

Let me explain why I would want a macro like this:
To me, a substantial benefit of “new” C API should be predictability: when reviewing code, you don’t need to look up every function in the manual to check details like:

  • does it return borrowed references?
  • does it need the GIL?
  • does it return -1 or 0 on errors?
  • does it steal references?
  • is NULL allowed for that argument?
  • does it type-check arguments for me?

But even if “new” functions are available, you only get that benefit if the “old” functions are unavailable. So, I’d welcome a macro that I could define when starting a new project, which would hide API with known minor issues – while still keeping that API available for existing projects.

This PEP adds a macro that could eventually serve that use case, but, IMO, the PEP’s rationale isn’t very clear. It seems to promote removing API that still works but has some slight issue – which I don’t think we should do.

The mentioned PyImport_ImportModuleNoBlock is a prime example of a function that we don’t ever need to remove. IMO, the effort spent in deprecating it outweighs the effort it’d take to keep it – and that’s before counting the effort spent by users who need to go update working code just because we found a slightly better way of doing things.

Here is a concrete example of how to set the Py_COMPAT_API_VERSION macro on the command line, by passing CFLAGS to setuptools, build and pip.


First, let’s see how build and pip render compiler warnings.

Build an extension using PyImport_ImportModuleNoBlock() with python3.13 -m build (unfold to see logs):

python -m build
$ python3.13 -m build
* Creating venv isolated environment...
* Installing packages in isolated environment... (setuptools >= 40.8.0)
* Getting build dependencies for sdist...
running egg_info
creating test_pythoncapi_compat.egg-info
writing test_pythoncapi_compat.egg-info/PKG-INFO
writing dependency_links to test_pythoncapi_compat.egg-info/dependency_links.txt
writing top-level names to test_pythoncapi_compat.egg-info/top_level.txt
writing manifest file 'test_pythoncapi_compat.egg-info/SOURCES.txt'
reading manifest file 'test_pythoncapi_compat.egg-info/SOURCES.txt'
writing manifest file 'test_pythoncapi_compat.egg-info/SOURCES.txt'
* Building sdist...
running sdist
running egg_info
writing test_pythoncapi_compat.egg-info/PKG-INFO
writing dependency_links to test_pythoncapi_compat.egg-info/dependency_links.txt
writing top-level names to test_pythoncapi_compat.egg-info/top_level.txt
reading manifest file 'test_pythoncapi_compat.egg-info/SOURCES.txt'
writing manifest file 'test_pythoncapi_compat.egg-info/SOURCES.txt'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md

running check
creating test_pythoncapi_compat-0.0.0
creating test_pythoncapi_compat-0.0.0/test_pythoncapi_compat.egg-info
copying files to test_pythoncapi_compat-0.0.0...
copying extension.c -> test_pythoncapi_compat-0.0.0
copying setup.py -> test_pythoncapi_compat-0.0.0
copying test_pythoncapi_compat.egg-info/PKG-INFO -> test_pythoncapi_compat-0.0.0/test_pythoncapi_compat.egg-info
copying test_pythoncapi_compat.egg-info/SOURCES.txt -> test_pythoncapi_compat-0.0.0/test_pythoncapi_compat.egg-info
copying test_pythoncapi_compat.egg-info/dependency_links.txt -> test_pythoncapi_compat-0.0.0/test_pythoncapi_compat.egg-info
copying test_pythoncapi_compat.egg-info/top_level.txt -> test_pythoncapi_compat-0.0.0/test_pythoncapi_compat.egg-info
copying test_pythoncapi_compat.egg-info/SOURCES.txt -> test_pythoncapi_compat-0.0.0/test_pythoncapi_compat.egg-info
Writing test_pythoncapi_compat-0.0.0/setup.cfg
Creating tar archive
removing 'test_pythoncapi_compat-0.0.0' (and everything under it)
* Building wheel from sdist
* Creating venv isolated environment...
* Installing packages in isolated environment... (setuptools >= 40.8.0)
* Getting build dependencies for wheel...
running egg_info
writing test_pythoncapi_compat.egg-info/PKG-INFO
writing dependency_links to test_pythoncapi_compat.egg-info/dependency_links.txt
writing top-level names to test_pythoncapi_compat.egg-info/top_level.txt
reading manifest file 'test_pythoncapi_compat.egg-info/SOURCES.txt'
writing manifest file 'test_pythoncapi_compat.egg-info/SOURCES.txt'
* Installing packages in isolated environment... (wheel)
* Building wheel...
running bdist_wheel
running build
running build_ext
building 'example_python_cext' extension
creating build
creating build/temp.linux-x86_64-cpython-313
gcc -fno-strict-overflow -Wsign-compare -DDYNAMIC_ANNOTATIONS_ENABLED=1 -DNDEBUG -fexceptions -fcf-protection -fexceptions -fcf-protection -fexceptions -fcf-protection -fPIC -I/tmp/build-env-y7i1zt8e/include -I/usr/include/python3.13 -c extension.c -o build/temp.linux-x86_64-cpython-313/extension.o
extension.c: In function ‘get_python_version’:
extension.c:19:5: warning: ‘PyImport_ImportModuleNoBlock’ is deprecated [-Wdeprecated-declarations]
   19 |     PyObject *mod = PyImport_ImportModuleNoBlock("sys");
      |     ^~~~~~~~
In file included from /usr/include/python3.13/Python.h:113,
                 from extension.c:1:
/usr/include/python3.13/import.h:54:44: note: declared here
   54 | Py_DEPRECATED(3.13) PyAPI_FUNC(PyObject *) PyImport_ImportModuleNoBlock(
      |                                            ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
creating build/lib.linux-x86_64-cpython-313
gcc -shared build/temp.linux-x86_64-cpython-313/extension.o -L/usr/lib64 -o build/lib.linux-x86_64-cpython-313/example_python_cext.cpython-313-x86_64-linux-gnu.so
installing to build/bdist.linux-x86_64/wheel
running install
running install_lib
creating build/bdist.linux-x86_64
creating build/bdist.linux-x86_64/wheel
copying build/lib.linux-x86_64-cpython-313/example_python_cext.cpython-313-x86_64-linux-gnu.so -> build/bdist.linux-x86_64/wheel
running install_egg_info
running egg_info
writing test_pythoncapi_compat.egg-info/PKG-INFO
writing dependency_links to test_pythoncapi_compat.egg-info/dependency_links.txt
writing top-level names to test_pythoncapi_compat.egg-info/top_level.txt
reading manifest file 'test_pythoncapi_compat.egg-info/SOURCES.txt'
writing manifest file 'test_pythoncapi_compat.egg-info/SOURCES.txt'
Copying test_pythoncapi_compat.egg-info to build/bdist.linux-x86_64/wheel/test_pythoncapi_compat-0.0.0-py3.13.egg-info
running install_scripts
creating build/bdist.linux-x86_64/wheel/test_pythoncapi_compat-0.0.0.dist-info/WHEEL
creating '/home/vstinner/myprojects/example_python_cext/dist/.tmp-j2nj1ll4/test_pythoncapi_compat-0.0.0-cp313-cp313-linux_x86_64.whl' and adding 'build/bdist.linux-x86_64/wheel' to it
adding 'example_python_cext.cpython-313-x86_64-linux-gnu.so'
adding 'test_pythoncapi_compat-0.0.0.dist-info/METADATA'
adding 'test_pythoncapi_compat-0.0.0.dist-info/WHEEL'
adding 'test_pythoncapi_compat-0.0.0.dist-info/top_level.txt'
adding 'test_pythoncapi_compat-0.0.0.dist-info/RECORD'
removing build/bdist.linux-x86_64/wheel
Successfully built test_pythoncapi_compat-0.0.0.tar.gz and test_pythoncapi_compat-0.0.0-cp313-cp313-linux_x86_64.whl

Did you spot the ‘PyImport_ImportModuleNoBlock’ is deprecated [-Wdeprecated-declarations] warning in these 89 lines of logs? I hope so, since I asked you to look for compiler warnings :slight_smile:

But honestly, it’s easy to miss such a warning if you don’t pay attention.

Now if I build and install the same extension with pip (unfold to see logs):

pip install
$ python3.13 -m pip install .
Defaulting to user installation because normal site-packages is not writeable
Processing /home/vstinner/myprojects/example_python_cext
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Building wheels for collected packages: example_python_cext
  Building wheel for example_python_cext (pyproject.toml) ... done
  Created wheel for example_python_cext: filename=example_python_cext-0.0.0-cp313-cp313-linux_x86_64.whl size=4037 sha256=0b7abb2627cc515dd7bd31ab7b1832f41f1282030a371b42b21c0d7f2c9984f8
  Stored in directory: /tmp/pip-ephem-wheel-cache-5udgoufe/wheels/79/8d/7b/7a75feee85b68ebcdbf357b59ce0b9ce7e5a78ea43b0a7f6bf
Successfully built example_python_cext
Installing collected packages: example_python_cext
Successfully installed example_python_cext-0.0.0

Did you spot the compiler warning this time? Nope, it’s hidden: pip doesn’t display the compiler logs.


Ok, now let’s try again with PEP 743. I set the Py_COMPAT_API_VERSION macro to Py_COMPAT_API_VERSION_MAX.

python -m build logs:

$ CC=clang CFLAGS="-DPy_COMPAT_API_VERSION=Py_COMPAT_API_VERSION_MAX" ../env/bin/python -m build
(...)
extension.c:19:21: error: call to undeclared function 'PyImport_ImportModuleNoBlock'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
   19 |     PyObject *mod = PyImport_ImportModuleNoBlock("sys");
      |                     ^
extension.c:19:15: error: incompatible integer to pointer conversion initializing 'PyObject *' (aka 'struct _object *') with an expression of type 'int' [-Wint-conversion]
   19 |     PyObject *mod = PyImport_ImportModuleNoBlock("sys");
      |               ^     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2 errors generated.
error: command '/usr/bin/clang' failed with exit code 1

ERROR Backend subprocess exited when trying to invoke build_wheel

Ok, this time the build fails, since the function is no longer declared.

python -m pip install logs:

$ CC=clang CFLAGS="-DPy_COMPAT_API_VERSION=Py_COMPAT_API_VERSION_MAX" ../env/bin/python -m pip install .
(...)
Building wheels for collected packages: example_python_cext
  Building wheel for example_python_cext (pyproject.toml) ... error
  error: subprocess-exited-with-error
  
  × Building wheel for example_python_cext (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [15 lines of output]
      running bdist_wheel
      running build
      running build_ext
      building 'example_python_cext' extension
      creating build
      creating build/temp.linux-x86_64-cpython-313-pydebug
      clang -fno-strict-overflow -Wsign-compare -fno-omit-frame-pointer -mno-omit-leaf-frame-pointer -g -Og -Wall -DPy_COMPAT_API_VERSION=Py_COMPAT_API_VERSION_MAX -fPIC -I/home/vstinner/myprojects/env/include -I/home/vstinner/python/main/Include -I/home/vstinner/python/main -c extension.c -o build/temp.linux-x86_64-cpython-313-pydebug/extension.o
      extension.c:19:21: error: call to undeclared function 'PyImport_ImportModuleNoBlock'; ISO C99 and later do not support implicit function declarations [-Wimplicit-function-declaration]
         19 |     PyObject *mod = PyImport_ImportModuleNoBlock("sys");
            |                     ^
      extension.c:19:15: error: incompatible integer to pointer conversion initializing 'PyObject *' (aka 'struct _object *') with an expression of type 'int' [-Wint-conversion]
         19 |     PyObject *mod = PyImport_ImportModuleNoBlock("sys");
            |               ^     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
      2 errors generated.
      error: command '/usr/bin/clang' failed with exit code 1
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for example_python_cext
Failed to build example_python_cext
ERROR: Could not build wheels for example_python_cext, which is required to install pyproject.toml-based projects

Same with pip, the build fails as expected.

I cheated by using clang instead of gcc. Apparently, GCC 13 still treats -Wimplicit-function-declaration as a warning, but GCC 14 will treat it as an error to prepare the ecosystem for C23. clang is used by default on macOS and FreeBSD. Or you can simply promote this warning to an error yourself (e.g. with -Werror=implicit-function-declaration).

Later, we can consider adding options to build and pip to set the Py_COMPAT_API_VERSION macro for us and pass the right compiler flags to treat -Wimplicit-function-declaration as an error.

1 Like

Would it be better to implement this with a #define in the source file, similar to how we did PY_SSIZE_T_CLEAN? I’d be okay with a Py_CLEAN_API subset - that feels far more like Py_LIMITED_API but with a different intent/subset.

Including a version number in there doesn’t seem to serve any value - the choices realistically are “the version of the headers being used” or “abi3”, and so it seems like what we want here is a per-source unit way to opt into “current version without already-replaced APIs”?
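For comparison, the PY_SSIZE_T_CLEAN precedent is a per-source-file opt-in defined before the include; a sketch of what the suggested equivalent could look like (Py_CLEAN_API is the hypothetical name proposed above, not an existing macro):

/* The PY_SSIZE_T_CLEAN precedent: an in-source, per-translation-unit
 * opt-in defined before the first Python.h include. */
#define PY_SSIZE_T_CLEAN
/* Hypothetical equivalent for the proposed subset: */
#define Py_CLEAN_API 1
#include <Python.h>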

I suppose that would be every time we try to support a new feature version of Python? Breaking CI once a year is probably not a deal-breaker for an active project.

More annoying would be the fact that the error messages from the compiler would not give a clue as to how to fix the issue - they would just mention an unknown function.

I can see something like this might be useful in advance testing.

I don’t have a hugely strong opinion - it isn’t usually the long-planned deprecations that cause Cython issues (because we’ve typically started using the new APIs ahead of time).

You can start a project on Python 3.13, and then want to compile for 3.14 without having to update all the code.

1 Like

What’s the use case for this?

Maintainers can already test whether they use deprecated APIs (at least with most compilers), including making the build fail when deprecated APIs are used. Setting Py_COMPAT_API_VERSION outside of CI or development seems risky to me, as it introduces the risk of breakage when users install a package with a newer Python version (similar to how -Werror in CFLAGS can cause issues).

The PEP mentions the following as one of the example uses:

Change the behavior of a function or a macro. For example, calling PyObject_SetAttr(obj, name, NULL) can fail, to enforce the usage of the PyObject_DelAttr() function instead to delete an attribute.

How would this work in practice? Changing the behaviour of PyObject_SetAttr can only be done by introducing a new API and aliasing PyObject_SetAttr to that when Py_COMPAT_API_VERSION is set to an appropriate value.
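Something like the following, presumably (PyObject_SetAttrStrict is a made-up name, used only to illustrate the aliasing):

/* Hypothetical header sketch: a strict variant that rejects
 * value == NULL, plus a redirect for code that opts in. */
PyAPI_FUNC(int) PyObject_SetAttrStrict(PyObject *obj, PyObject *name,
                                       PyObject *value);

#if Py_COMPAT_API_VERSION >= 0x030f0000
/* Opted-in code transparently gets the strict behaviour. */
#  define PyObject_SetAttr PyObject_SetAttrStrict
#endif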

BTW. This proposal reminds me a little of the availability system in Apple’s system headers. Sadly that feature uses clang function attributes that are very specific to Apple’s use case and cannot be extended.

Do you mean “build for 3.13 excluding functions to be removed in 3.14 but including functions to be removed in 3.15”?

If you’re compiling for 3.14, you need the 3.14 headers. I’m not sure who needs to target only one release worth of deprecations but not two, but it still sounds like a scenario better served by /D Py_NOT_QUITE_AS_CLEAN_API=1 rather than having to figure out the version value.

Or perhaps it means we need a concept of “pending deprecation” again?

Deprecated API marked with Py_DEPRECATED() is the simple case.

For example, there are private functions which are not marked as deprecated (that might change in the future).

It becomes more complicated when changes cannot be marked with a deprecation, like PEP 674: disallow using a macro as an l-value (disallow Py_TYPE(obj) = new_type;).
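Concretely, the PEP 674 change looks like this in extension code (Py_SET_TYPE() has existed since Python 3.9; the snippet is just an illustration):

#include <Python.h>

/* PEP 674 in practice: no deprecation warning can be attached to the
 * old spelling, the assignment simply stops compiling. */
static void
change_type(PyObject *obj, PyTypeObject *new_type)
{
    /* old: Py_TYPE(obj) = new_type;  -- rejected once Py_TYPE() is no
     *                                   longer an l-value */
    Py_SET_TYPE(obj, new_type);       /* replacement, added in Python 3.9 */
}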

Or when removing an #include of a standard header file which pulls in many “standard” functions. How do you put a compiler warning on that?

As Petr wrote, the problem is also opting out of “legacy functions” which are not deprecated, maybe only soft deprecated, like functions returning borrowed references. There is no plan to get rid of them soon, but using them is a risk, especially when you consider the future of Python with Free Threading.

Usually, a CI should be strict, whereas a release (tarball or whatever) should be non-strict. For example, a CI would use Py_COMPAT_API_VERSION and/or -Werror, but a release should not. You can have a dedicated CI job for Py_COMPAT_API_VERSION which doesn’t block the workflow when it fails (a “not mandatory” / “optional” CI).

The PySlice_GetIndicesEx() API has a strange history:

Py_DEPRECATED(3.7)
PyAPI_FUNC(int) PySlice_GetIndicesEx(PyObject *r, Py_ssize_t length,
                                     Py_ssize_t *start, Py_ssize_t *stop,
                                     Py_ssize_t *step,
                                     Py_ssize_t *slicelength);

#if !defined(Py_LIMITED_API) || (Py_LIMITED_API+0 >= 0x03050400 && Py_LIMITED_API+0 < 0x03060000) || Py_LIMITED_API+0 >= 0x03060100
#define PySlice_GetIndicesEx(slice, length, start, stop, step, slicelen) (  \
    PySlice_Unpack((slice), (start), (stop), (step)) < 0 ?                  \
    ((*(slicelen) = 0), -1) :                                               \
    ((*(slicelen) = PySlice_AdjustIndices((length), (start), (stop), *(step))), \
     0))
#endif

Depending on the API version, you get one implementation or another. You can imagine something similar with Py_COMPAT_API_VERSION. I’m not sure if PyObject_SetAttr() is the best example, since I’m not sure that we want to change PyObject_SetAttr().

Didn’t we change it recently to reject NULL instead of it meaning to delete the attribute? I know we discussed it, so it’s probably one of the best current examples we have.

Me? I mean “build for 3.14, excluding functions that were marked problematic (and had better alternatives) in 3.13”. I don’t think we should remove functions – or mark them for removal – if they still work.

Yeah, Py_COMPAT_API_VERSION is also a solution for that: let people move to recommended functions without having to introduce a “hard” deprecation.

For example, we may want to soft deprecate PyDict_GetItem(), recommend PyDict_GetItemRef(), and decide that Py_COMPAT_API_VERSION=Py_COMPAT_API_VERSION_MAX removes the function: using it fails with a build error.
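A sketch of what that migration looks like in extension code (PyDict_GetItemRef() was added in Python 3.13):

#include <Python.h>

/* Migration sketch: PyDict_GetItem() returns a borrowed reference and
 * suppresses errors; PyDict_GetItemRef() returns a strong reference
 * and reports errors. */
static int
lookup(PyObject *dict, PyObject *key, PyObject **value)
{
    /* old: *value = PyDict_GetItem(dict, key);  (borrowed reference) */
    return PyDict_GetItemRef(dict, key, value);
    /* 1: found, *value is a new strong reference;
     * 0: not found, *value set to NULL;
     * -1: error, with an exception set. */
}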

So this still feels better suited to another API set, more like a not-quite-as-limited API, than to anything to do with version numbers.

I’m not totally convinced that either option is more helpful than just documenting “not recommended - use Py_Alternative instead”, but if it is, an ifdef check would be far easier to reason about than version comparisons.

Presumably functions in 3.13 that have better alternatives are still functions in 3.14 that have better alternatives, and so the version isn’t as important as the fact that you want builds to fail if they’re using functions that have better alternatives.

Thanks for the ping @pitrou. I like the idea of opting into advance warnings by turning C API deprecations into build errors. Many projects do this for Python API usage, and it helps with either proactively fixing things in a project to be future-proof or to be in time to object to a deprecation. For C API, this new macro will be a bit more granular than using -Werror, which isn’t always feasible to enable in CI of large projects. We’d likely start using this in NumPy/SciPy in a CI job.

This is also interesting, but seems conceptually separate. Perhaps it requires two macros:

  1. Hide API surface that is scheduled for future removal
  2. Hide non-recommended API surface

Where (2) is a superset of (1). I think for both of those, it is safe to enable them in releases of downstream packages when set to a fixed, already-released CPython C API version. And using them without a fixed version should only be done in CI or manual testing.

If (1) and (2) were combined into a single macro, it still seems useful and better than not having anything like this. The difference is in “work caused” - (1) requires action sooner or later, while (2) is a nice-to-have but safe to ignore in existing projects.

In NumPy we have a similar mechanism (NPY_NO_DEPRECATED_API, see docs here), and the ability to set fixed version numbers is important in my experience.
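For those unfamiliar with it, the NumPy mechanism is used roughly like this (the pattern is from the NumPy docs; the version constant here is just an example):

/* Pin the deprecation cut-off to a fixed NumPy API version so that a
 * new NumPy release cannot suddenly break the build. */
#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION
#include <numpy/arrayobject.h>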

3 Likes