Requiring compilers' C11 standard mode to build CPython

For Python 3.11, we decided that CPython uses the C11 standard (see PEP 7). As far as we know, all relevant compilers support C11.
However, I found out that our CI isn’t all set up to use C11, so we can’t actually use all the features yet.

So, in practice, the features we can use are limited by how the CI and buildbots are set up. I don’t think that’s a terrible situation. It’s vaguely defined, but for platforms/configs that don’t have a stable buildbot, the rule amounts to: build with C11 and you’re safe.
But maybe we want to change it, so we don’t need to work around things that have been standardized for a decade?

Specifically, I wanted to use `alignof(max_align_t)`, but `alignof` and `max_align_t` aren’t available on all CI runs and buildbots. I worked around this.

I asked autotools for an ALIGNOF_MAX_ALIGN_T value, added it to the list of hardcoded ALIGNOF_* values for Windows, and added (and tested) a fallback to alignof(long double) for when max_align_t isn’t available. Instead of alignof I used _Alignof – compilers support the actual functionality, and their C11 mode only adds the convenience macro (which is backwards-incompatible).
If you think this particular case can be done better, comment at #103509 or open a new issue.
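To make the workaround concrete, here is a hedged sketch of that fallback. ALIGNOF_MAX_ALIGN_T stands in for a configure-provided value, and MAX_ALIGNMENT is an illustrative name, not necessarily the real CPython macro:

```c
#include <stddef.h>

/* Sketch of the fallback described above. ALIGNOF_MAX_ALIGN_T is assumed
 * to come from autoconf (or the hardcoded Windows list); the names here
 * are illustrative, not the actual CPython spellings. */
#if defined(ALIGNOF_MAX_ALIGN_T)
#  define MAX_ALIGNMENT ALIGNOF_MAX_ALIGN_T
#else
/* Use the C11 keyword _Alignof directly: compilers implement the keyword
 * even outside C11 mode, while the alignof macro is only provided in
 * C11 mode. long double is traditionally the most strictly aligned
 * scalar type, so it serves as a fallback for max_align_t. */
#  define MAX_ALIGNMENT _Alignof(long double)
#endif
```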

Opening up Python in Microsoft Visual Studio (the tool that has a GUI for the XMLs in PCBuild/ – not VS Code), I can see that we use “Legacy MSVC” rather than “ISO C11” (see a screenshot in the docs). Do we want to change that?

There are also *nix buildbots that would need changing, but I haven’t looked into how to do it.


Not all of our settings flow through into VS properly (VS reads those settings out of project files directly, while MSBuild allows shared properties files to be “included” and we make use of this), but I believe in this case it’s correct.

The main thing that we’d be enforcing is a minimum compiler version. Last time we pushed the minimum up, core devs complained that we broke their setup and they didn’t want to upgrade their compilers. (I don’t recall who, it’s probably on a bug somewhere.)

I don’t have any other concerns with enabling the compiler mode, except that we must ensure our header files do not require C11 and also work in C++.


I don’t see why not, at least for 3.12+. I’m not sure if we’d want to change 3.11 at this point in its lifecycle.

What kind of changes to which builders? For the most part, any changes would need to be done by their owners unless we’re just talking about things like arguments to ./configure.


We do require C++ compatibility for headers. AFAIK there’s even a test for it, but it’s pretty limited (and the distutils removal didn’t help).
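For context, C++ compatibility for a C header usually comes down to the standard extern "C" guard pattern, sketched below (my_api_function is a hypothetical name, not a real CPython API):

```c
/* The usual pattern that keeps a C header consumable from C++:
 * wrap declarations in extern "C" when compiled as C++, so names
 * get C linkage instead of being mangled. */
#ifdef __cplusplus
extern "C" {
#endif

/* Hypothetical API, defined inline here so the sketch is self-contained. */
static int my_api_function(int x) { return x + 1; }

#ifdef __cplusplus
}
#endif
```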

Not requiring C11 for headers is another thing. What should we require? C99? Or C89 with select C99 features, as in 3.10?
We should document that in PEP 7 and test it, otherwise someone’s bound to break it. I guess the current CI and buildbot settings are keeping us in line somewhat, but that’s not ideal.

Yes, definitely. You did mean 3.13+, right?

I’ll not look into that until/unless we decide it’s a good idea.

I think we have to require C89 with very very few select C99 features, unless we also start requiring a particular C++ standard as well. There’s very little overlap or consistency between things added post-1990 to each language.

It shouldn’t be an onerous requirement, unless people want to get way too clever with macros/inline functions. And I think we’re best to not get way too clever with those. :wink:

The problem here is that you can’t really test this. A guideline without tests isn’t very useful.

We could be compatible with both C99 and C++. That’s testable, separately. And so is C11 and C++ (which would IMO be a better choice, since C11 removed the C99 features MSVC doesn’t support.)


It’s useful enough for winning arguments on the bug tracker, even if we don’t discover the issue until later. I’ve had to argue the point before.

Appeal to Authority (in this case, our guideline) often wins where Appeal on-behalf-of-actual-users just extends the argument.

Sure, and yeah C11 is probably easier. But there’s no reason why our C users should be pushed into a later compiler version just to include our headers either.

Basically, C89 + // comments is as far as I’d go. And I could be convinced that we don’t need // comments in our headers either. Designated initializers are a non-starter, as they push the C++ requirement too high, IMHO. What other features are left to quibble over?
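To illustrate why designated initializers in particular are a non-starter: the `.x =` syntax is C99, and C++ only gained a (restricted) form of it in C++20, while positional initialization works everywhere. A minimal sketch:

```c
/* Designated vs. positional initialization of the same struct. */
struct point { int x; int y; };

static const struct point p_c99 = { .x = 1, .y = 2 };  /* C99 / C++20 only */
static const struct point p_c89 = { 1, 2 };            /* C89 and any C++  */
```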

Incidentally, we have a couple of C++ files as part of the Windows build, so compilation should break if anyone introduces unsupported things into the headers. They build with C++20 enabled in CI though (if available), because argument clinic requires it, so they’re only half a test.


Looks like the best path forward is testing both C11 and (some specific version of) C++, while documenting C89 plus a list of extensions.

PEP 7 currently lists C11, so it would need changing for the argument to authority.

I have the tests on my TODO list. I probably won’t get to them any time soon, but when I do I’ll know what to do! Of course, anyone is free to work on them (and I can mentor someone who knows C & shell).

There are fixed-width integers and static inline functions, but I don’t think those are up for quibbling over :)
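Those two C99 features look like this in practice (the function here is a hypothetical example, not a real CPython API):

```c
#include <stdint.h>

/* C99 features already relied on in 3.10-era headers: fixed-width
 * integer types from <stdint.h> and static inline functions. */
static inline uint32_t saturating_add_u32(uint32_t a, uint32_t b)
{
    uint32_t s = a + b;
    return (s < a) ? UINT32_MAX : s;  /* clamp on unsigned wraparound */
}
```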


This is coming up again with the immortal objects implementation. It currently uses a C++ compatible C11 feature in Include/object.h: an anonymous union inside of our struct _object. For performance and ease of implementation reasons. See: object.h uses an anonymous union in a struct (older C incompatible) · Issue #105059 · python/cpython · GitHub

Discussion in this thread suggesting that our header files shouldn’t use modern C++ compatible C11 syntax and features makes no sense to me. We declared that CPython 3.11 and later require C11 and C++ compatibility in PEP-007. Period. We neglected to specify the C++ version but it seems fair to read that as implying C++11; C++ already supported anonymous unions for simple types before that as far as I can tell.

I don’t like the idea of reverting PEP-7 to “we can use modern features, except not in header files because someone on ancient platforms that somehow has a functioning build of CPython but are only willing to use a semi-modern compiler will have problems”. People in that unusual boat are free to stick with CPython <=3.10 until they finish bailing themselves out.

Visual Studio 2019 includes C11 and C17 support.

Want a modern Python? You need at least a semi-modern build toolchain.


Like all decisions, this can be reverted. (Including in 3.12, since it turns out that in 3.11 the headers actually remained C89-with-some-C99.)
Does it hurt enough users to consider reverting? Probably not, but we can’t really know. Unfortunately, the users affected by this are likely to be very late adopters.

someone on ancient platforms that somehow has a functioning build of CPython but are only willing to use a semi-modern compiler will have problems

AFAIK it’s not just about compilers, but compiler settings. Updating the compiler should go fine (modulo the inevitable issues), but switching the C standard is a backwards-incompatible change.

I agree with not bothering with C++ compilers pre-C++11. As for C, we might want to be more conservative as embedded platforms tend to use vendor-specific compilers that lag behind the latest standards. But I don’t have any data.


Great – being late adopters means there’s even more time for them and their platforms to get their compilers up to date, which makes it even less of a reason to constrain CPython development in the now.

… which was announced for 3.11, and no-one has complained publicly in the year that has passed since then?


I reported the issue because it broke my build, and I am definitely on the latest compilers. The problem is that I’d configured that particular build to be more standards compatible, and so the break is because while the compiler has supported that feature unofficially forever, it now warns that I’m using something that isn’t actually in the relevant standard.

Yes, this is fixable by disabling standards mode. But it’s also fixable by keeping our headers clean, which has been the approach for far longer than expecting our users to update their build scripts for a new release. It’s probably okay in my case, but for some users changing these settings will impact the rest of their project, and they’ll (justifiably!) have to defer updating CPython because it’s too costly to update the rest of their codebase.

I disagree that this seems fair. The public API impacts the entirety of our users’ code, which is far beyond our scope. The settings for our own build only impact our own build, so we get to choose what they are.

Requiring a particular language version in the public API is the equivalent of requiring a particular operating system version (in some case, this is a direct equivalent). Yes, we do it, but only when the burden of supporting the older version is too great.

We should be dropping the old autoconf before we drop C++ prior to C++11 :wink:


There is no great benefit to using anonymous unions. They can make the code a bit cleaner, but they don’t make that much difference.

When converting _Py_CODEUNIT from a typedef of a uint16_t into a proper union, I used an anonymous union. Steve pointed out that it broke his build, so we changed it.
Instead of inst->opcode, you write inst->op.code. No big deal.
Compatibility seems more important than saving a dot.
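The trade-off can be sketched like this (an illustrative layout, not the real _Py_CODEUNIT definition): with a named inner member `op`, code writes `inst->op.code` instead of the anonymous-union `inst->opcode`, and the header stays compatible with pre-C11 C and with C++:

```c
#include <stdint.h>

/* Named-member alternative to an anonymous union inside a union/struct.
 * CodeUnitSketch is a hypothetical stand-in for _Py_CODEUNIT. */
typedef union {
    uint16_t cache;       /* whole 16-bit unit at once */
    struct {
        uint8_t code;     /* accessed as u.op.code, not u.opcode */
        uint8_t arg;
    } op;                 /* named member: no C11 anonymous union needed */
} CodeUnitSketch;
```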


In this specific case adding a . to code is a non-starter because the struct in question is a Public API.

Our only guiding PEP on this is PEP-7 and it says we can use C11 everywhere (Per the PEP 7: Python 3.11 uses C99 and C11 by vstinner · Pull Request #2309 · python/peps · GitHub update that came about in part due to Python can now use the C99 NAN constant or __builtin_nan() · Issue #90798 · python/cpython · GitHub).

If we want to declare that we cannot use C11 in some portion of our headers we need PEP-7 to be updated to say that and needs to include specific rationale as to why and when we can change.

Before accepting such a change I’d want answers to the questions of who specifically we are spending excess time to support because they cannot use a C11-capable compiler, and why we’ve decided they are worth the extra engineering time. If we can’t answer that then we’re doing all of this for no good reason. Let’s not jump through hoops just because we always have.

We need to identify a compelling reason that isn’t a bunch of hypothetical “maybes” and “someone mights” or “a mythical non-tier secretive platform could be stuck in the 90s”. Do we have any?


Latest MSVC in C mode with -W4 argument raises four warnings from our headers:

> cl /c /I(python3.12 -c "import sysconfig; print(sysconfig.get_config_var('INCLUDEPY'), end='')") /W4 .\t.c
Microsoft (R) C/C++ Optimizing Compiler Version 19.37.32619.1 for x64
Copyright (C) Microsoft Corporation.  All rights reserved.

...\Include\object.h(173): warning C4201: nonstandard extension used: nameless struct/union
...\Include\cpython/unicodeobject.h(203): warning C4100: '_unused_op': unreferenced formal parameter
...\Include\cpython/unicodeobject.h(393): warning C4100: '_unused_op': unreferenced formal parameter
...\Include\cpython/pytime.h(192): warning C4115: 'timeval': named type definition in parentheses

Latest MSVC with Microsoft extensions disabled produces 7 (a few repeats):

> cl /c /I(python3.12 -c "import sysconfig; print(sysconfig.get_config_var('INCLUDEPY'), end='')") /Za .\t.c
...\Include\object.h(173): error C2467: illegal declaration of anonymous 'union'
...\Include\pycapsule.h(31): warning C4224: nonstandard extension used: formal parameter 'destructor' was previously defined as a type
...\Include\pycapsule.h(45): warning C4224: nonstandard extension used: formal parameter 'destructor' was previously defined as a type
...\Include\cpython/pytime.h(192): warning C4115: 'timeval': named type definition in parentheses
...\Include\cpython/pytime.h(198): warning C4115: 'timeval': named type definition in parentheses
...\Include\cpython/abstract.h(100): error C2097: illegal initialization
...\Include\cpython/abstract.h(148): error C2097: illegal initialization

For apps that build with all warnings as errors, this makes CPython unusable.

Enabling /std:c11 (which changes the behaviour of my entire program, not just the CPython headers :fearful: ) only fixes the one /W4 warning about anonymous unions, as well as forcing my hand on Microsoft extensions (didn’t check which direction, but I can’t use /Za /std:c11 to disable them, which means they’re either permanently disabled or enabled).

C++ mode doesn’t produce the anonymous union warnings, but it does produce the unreferenced parameter warnings.

/Wall is flooded with struct padding and discarded static inline functions (mostly ours), but if you disable those two warnings you only get unreferenced parameters and a few mismatched signed/unsigned types.

> cl /c /I(python3.12 -c "import sysconfig; print(sysconfig.get_config_var('INCLUDEPY'), end='')") /std:c++14 /Wall /wd4514 /wd4820 .\t.cpp
Microsoft (R) C/C++ Optimizing Compiler Version 19.37.32619.1 for x64
Copyright (C) Microsoft Corporation.  All rights reserved.

...\Include\cpython/unicodeobject.h(203): warning C4100: '_unused_op': unreferenced formal parameter
...\Include\cpython/unicodeobject.h(343): warning C4365: '=': conversion from 'unsigned int' to 'int', signed/unsigned mismatch
...\Include\cpython/unicodeobject.h(367): warning C4365: '=': conversion from 'unsigned int' to 'int', signed/unsigned mismatch
...\Include\cpython/unicodeobject.h(393): warning C4100: '_unused_op': unreferenced formal parameter
...\Include\cpython/longintrepr.h(121): warning C4365: 'initializing': conversion from 'uintptr_t' to 'Py_ssize_t', signed/unsigned mismatch
...\Include\cpython/abstract.h(60): warning C4365: 'return': conversion from 'size_t' to 'Py_ssize_t', signed/unsigned mismatch

Obviously enabling /WX is going to break these users entirely, and if I cared enough about correctness to enable this (which I do in some cases), I wouldn’t be comfortable with disabling the relevant warnings across my entire application. I’d fork and patch the header files.

So my summary here is that we are impacting users who care about warnings, and are unwilling/unable to change the language of their entire project to accommodate one of our implementation details.


Steve, could you tell us more about these users? (I presume they might be internal MS projects that you work with?) Are they building Python from source or compiling extensions?

Literally me, and yeah, they’re internal MS projects. In one case I’m compiling extension modules (though I am less concerned about warnings on that one), and the main one is recompiling python.c with additional audit hooks and quite a bit of security sensitive code.

So a little bit secretive, but not mythical/maybe/might. I’ve reported this kind of issue a few times because my code literally breaks when I update CPython and I have to stop making builds available to my internal users :slight_smile:

[Later] I guess the “unwilling/unable to change their language” is stretched a bit with it being me. I’m definitely aware of many projects who would not be willing or able to make that change, but the intersection between those projects and mine (or others I know of that build against 3.12 alpha/betas) is zero.


Thanks Steve! My takeaway is that Microsoft hasn’t really landed C11 in MSVC yet if it is not the default compilation mode. Their philosophy on default compilation mode is apparently the opposite of what the rest of the world does? Clang and g++ default to recent C & C++ standards, updating that fairly regularly upon new releases. People who want something older need to explicitly specify that on their compiler command line.

A consequence of this is that people using the MS toolchain could see warnings when including our header files, depending on which level of warnings they have turned on. If they use a -Werror equivalent, those warnings become build failures that they’d then have to adjust their build to work around on those files.

So we’d be doing this “for Windows users” because their toolchain doesn’t cope by default?

Is there ever a way out of this trap?

How’re we supposed to encode a rule of “our public .h files cannot use any feature that Microsoft does not enable by default in their latest compiler” in PEP-7?

Long term wish: any idea who can be poked at Microsoft to move that needle towards better aligning with what Clang does in the long run? (Or even better: just give up on Microsoft’s own compiler and have them work on improving Clang and shipping that like everybody else?)


PEP 7 is a style guide. It starts out with “rules are there to be broken”.
Let’s come up with a new PEP, rather than update this one?

That wording is already much more specific than the current “No compiler warnings with major compilers (gcc, VC++, a few others)”.