Toolchain upgrade on Windows?

Cross-posting from for hopefully wider discussion:

Now that Visual Studio 16.8 (<-> MSVC 19.28) has just been released, I think it would be worthwhile to consider upgrading the compiler toolchain used to build the CPython Windows binaries, particularly before the release of 3.10.

That’s because many libraries (e.g. numpy/scipy) are tied to the same compilers as CPython for ABI compatibility, and generally, MSVC is by far the lowest common denominator in terms of C/C++ standards compliance, cf.

For example, dropping Python 3.6 support in scipy should finally enable it to use C++14/C++17, since Python 3.7+ is built with Visual Studio 15.7, which has essentially complete support for those standards, cf. &

However (& as far as I can tell), the Windows compiler version for CPython hasn’t moved since the release of 3.7, cf., and every release without a toolchain upgrade means another year of waiting for the ecosystem to unlock more modern C/C++.

The reason Visual Studio 16.8 is particularly interesting is that Microsoft had long paid little attention to C compliance: C99 support was only recently completed, with C11/C17 following in 16.8 (though as of yet without optional parts of the standard such as atomics, threading, VLAs, and complex types), cf.

Looking at the table from, it would be cool if we could add the last line as follows:

===================   ==============   ===================
CPython               MS Visual C++    C Standard
===================   ==============   ===================
2.7, 3.0, 3.1, 3.2    9.0              C90
3.3, 3.4              10.0             C90 & some of C99
3.5, 3.6              14.0             C90 & most of C99
3.7, 3.8, 3.9         15.7             C90 & most of C99
3.10                  16.8             C99, C11**, C17
===================   ==============   ===================

** [comment about lack of C11 optionals]


Can you explain why the compiler version matters?

It should only depend on the runtime version, which has been 14.x since CPython 3.5, but if there’s some other place where the compiler version leaks into extension modules it’d be nice to know about it.


Hey @steve.dower, thanks for chiming in!

Is the dependence of CPython on the C runtime version documented somewhere? I don’t know whether the current “knowledge” floating around in the numpy/scipy/conda-forge etc. ecosystems (at least to the perhaps limited degree I caught up on it; e.g.) was just from lack of knowledge, ancient customs, or actual problems.

More likely (after some googling), it could be that the broader packaging ecosystem has not gotten the point about the universal runtime yet, since:

Visual Studio 2015, 2017 and 2019

This is different from all previous Visual C++ versions, as they each had their own distinct runtime files, not shared with other versions.

So IIUC - aside from any compiler-version leaks like those you hypothesised, which would be interesting to know about - things should work more or less out of the box when using a newer MSVC to compile e.g. scipy for Python 3.6?


Yep, and if they don’t, it’s a compiler/runtime bug that I can report to the team at Microsoft (not that a lot can be done about some things).

The actual compiler version used for CPython increases whenever I remember to update my build machine, but it’ll stay on v14 for anything that’s already released. Last I heard the MSVC team have no plans to up their version, so the next one should be a while away.


Thanks for the explanations @steve.dower!


One more question: would you happen to know if the universal runtime is planned to stay universal in perpetuity, or will there (have to) be a break at some point in the future?


The universal runtime (UCRT) is tied to Windows now, so it won’t break compatibility until the entire OS does. It also uses API sets, which ensure that even breaking changes on the OS-side can preserve public-facing behaviour for older clients.

The VC runtime is version 14.x right now, and when it moves to 15.x there will be some incompatibilities. These are only going to affect C++ apps, realistically, as practically all the C runtime made it into UCRT. Chances are it’ll be fine to have both v14 and v15 loaded at once, unless exceptions cross the boundaries. Extension modules will be able to try using v15 before CPython moves, which will make testing easier, but until it actually exists we have no way to know what the impact will be.

I finally got around to reflecting this in the scipy toolchain documentation, in DOC: update toolchain.rst to reflect windows universal C runtime by h-vetinari · Pull Request #13713 · scipy/scipy · GitHub, and @rgommers voiced concerns about the impact re: potential subtle bugs and the difficulty of debugging them.

As an intermediate step, I opened POC: Build windows with vs2019 by h-vetinari · Pull Request #165 · conda-forge/scipy-feedstock · GitHub, where Isuru from conda-forge/core pointed out that if scipy were to build its Windows wheels with vs2019, users would still need to upgrade their UCRT (presumably through Windows Update?), and that this would complicate all static linkages, because then any library linking to (e.g. npymath.lib) would need a linker that’s at least as new.

Since this new input sounds like the opposite of what you said previously, I wanted to kindly ask for your thoughts on this, especially as it would put scipy back to square one regarding newer toolchains until is solved (+ 4 years until that’s the oldest supported Python version), which would mean not being able to use C99/C11, etc.

Edit: I unintentionally misrepresented Isuru’s statement; he clarified on the issue:

Note that I didn’t mean to say that UCRT needs to be updated. I mean Visual Studio C++ runtime.


Thanks for the clarification.

You probably want to statically link the C++ dependencies, if you are using them. At the very least, distribute your own copy of it in your wheel (or require users to install it themselves, which is the better way to do it if possible - conda should be able to reference one of their copies).

CPython does not use or include anything to do with C++ support, so there won’t be any conflicts there. And if we ever start, we’ll statically link them and keep them out of the ABI, so there won’t be any conflicts.

It’s up to the installer (person) to ensure that everything they install works with their C++ library, just as it is on other platforms. And yes, this makes it hard for libraries that want to distribute ready-to-use builds via PyPI, but the alternative would make legitimate scenarios impossible (and increase the maintenance burden on the core team), so I’ll continue to argue that it’s PyPI/packages that need fixing here.


Sorry I dropped the ball on responding here - thank you for your responses! I’m still keen to unblock numpy/scipy to use newer C/C++ standards, but it sounds like it’ll take a while…
