TLDR: How has the decision-making around macOS version support been done historically, and when does it next make sense to revisit the current floor (which I take to be 10.9, based on the installer)?
Python 3’s minimum supported macOS version (according to the download page) is 10.9, though from searching here and on GitHub I couldn’t find where or when this choice was made.
This gap (Rust 1.74 raised its minimum supported macOS version to 10.12) means that Python users on macOS 10.9 through 10.11 risk being cut off from the Rust portion of the package ecosystem. The solution for now is for me to make it clear in the PyO3 documentation that Rust packages should avoid setting a minimum Rust version of 1.74 or higher, though this may eventually create tension if Rust projects need newer compiler features.
(We already made use of some extremely helpful Rust 1.75 features in pydantic-core so I am now having to think hard about how / if I can follow my own advice and downgrade.)
PyO3’s next release will update its own minimum Rust version from 1.56 to 1.63; I anticipate that staying below a minimum of 1.74 will be quite comfortable for many projects for another year or two. This conversation is therefore not urgent, but I would still be glad to understand what the process would be if I were to ask CPython to also consider a minimum of macOS 10.12.
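For package authors following that advice, the MSRV can be declared in `Cargo.toml` via the `rust-version` field, so that users on older toolchains get a clear diagnostic rather than a confusing build failure. A hypothetical excerpt (the package name is a placeholder):

```toml
[package]
name = "my-pyo3-extension"   # placeholder name, purely illustrative
version = "0.1.0"
edition = "2021"
# Declared minimum supported Rust version; keeping this below 1.74
# preserves buildability for users on macOS 10.9 through 10.11.
rust-version = "1.63"
```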
This does not answer your question, but FWIW for pyca/cryptography we’ve been setting a minimum macOS version of 10.12 since our Jan 1, 2023 release. Prior to that, we’d been at 10.10 since our Jul 20, 2020 release. I don’t believe we’ve ever gotten a complaint about these minimums.
This is by way of saying: a minimum of 10.9 appears to be incredibly conservative and probably not necessary.
By minimum macOS version, is that just what you distribute or do you actually depend on features of the newer OS?
If it’s just a distribution choice it’s possible that users on older macOS are building from source. That might change once cryptography’s minimum supported Rust version crosses past 1.74 and they can no longer build at all.
The precise versions I mentioned are what we distribute. However, I believe we may have been relying on some newer macOS features that also imposed a minimum bound. (We have a user who distributes a popular desktop app that includes us, and our minimum macOS version has mostly tracked theirs.)
Well, it seems reasonable to me that the core language distro is more conservative than (some) parts of the ecosystem. This allows people stuck in a time capsule to at least use the core distro to keep hacking in their own restricted world. I don’t see this as making a commitment on behalf of other parts of the ecosystem – you need to decide for yourself what your users’ needs are – but I also don’t see that just because the ecosystem has moved on, the core language is required to also shift its support. That should be guided by the core language’s needs (maybe we cannot do without some feature that only exists in a newer OS version, or maybe Apple makes it hard to support old OS versions with newer compiler toolchains), weighed against the desire to leave no user behind.
There are at least two questions here, I think: 1. What is the minimum version of macOS that we support when building cpython from source? and 2. What is the minimum version of macOS that we support in the “official” python.org binary installers? Up to now, we haven’t had a written (PEP) policy for either, unlike for Windows, and we should; I’ll finally try to get to it soon.
As far as installers go, our unofficial policy has been to try to cater to as broad a range of macOS hardware and operating systems as practical, particularly considering that we believe a primary audience for the macOS installer to be the education community, where the use of older hardware and software is probably more common than in many other segments. Unfortunately, we have very little hard data that I am aware of on what configurations the macOS installers are being used on. So, yes, we’ve tried to be very conservative. And there have been some good reasons in the past for our choices when changing the minimum supported version, based mainly on Apple product decisions like dropping PPC support or 32-bit support.
One thing that has changed over the years, and that perhaps we haven’t emphasized enough, is the big change to more fully support weak linking of Python on macOS: we can now reliably build cpython on newer versions of macOS while continuing to support older versions lacking newer APIs and features, through the use of weak linking and execution-time checks. This was added as part of the support for Apple Silicon Macs in 3.9.1+. Previously we had to actually build cpython on the minimum-supported OS version, which prevented use of newer OS features; that is no longer the case. So it is by no means mandatory that packages be built with the earliest minimum version of the Python installer itself.
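For package and extension authors, the practical upshot of those execution-time checks is that API availability should be probed at runtime rather than assumed from the build platform, since a binary built on a new macOS may run on an older one. A minimal sketch in Python (the helper name `utime_nofollow` is my own, purely illustrative):

```python
import os

def utime_nofollow(path, times=None):
    """Set timestamps on path itself, not following a final symlink,
    falling back gracefully when the running platform lacks
    follow_symlinks support (as can happen at runtime with a
    weak-linked macOS build on an older OS)."""
    if os.utime in os.supports_follow_symlinks:
        os.utime(path, times, follow_symlinks=False)
    else:
        # Older OS at runtime: fall back to the plain call.
        os.utime(path, times)
```

The same pattern (checking the `os.supports_*` sets, or `hasattr`, before use) applies to any API that may be weakly linked in the interpreter.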
That said, despite the lack of hard usage data, there are good reasons to consider raising the minimum supported version of the macOS installer. A big one is that Apple no longer fully supports OS versions older than macOS 10.13 in the latest versions of the Xcode and Command Line Tools tool chains. And it is increasingly harder to build and test newer major features, like free threading and JIT compilation, on the oldest systems. So this question is timely.

I’ve committed to providing optional (and experimental) free-threaded cpython binaries for the 3.13 release through python.org. Unfortunately, that didn’t quite happen in time for 3.13.0 beta 1, but it will be available in an upcoming beta (beta 2, if at all possible). As part of that, we will bump the minimum supported version to 10.13 for at least the free-threaded binaries, but I’d like to do that for the traditional (GIL) binaries as well.

Given the lack of hard data, I think the best approach is to just make the change and ask for any feedback in the release notes and the installer READMEs. We can also point users of older macOS systems (like 10.9 through 10.12) to other distributors of Python for macOS. In particular, the MacPorts project provides best-effort support, and often pre-built binaries, for many versions of Python as well as third-party packages on a very wide range of macOS versions (even earlier than 10.9). And we would have the option to revert to 10.9 prior to 3.13.0 final if the change proves to be too impactful for our users.
Also, note that the above is strictly for the python.org installer binaries. We aren’t planning to do anything further for 3.13 to change what macOS versions are supported when building from source, i.e. things should continue to work as well as they do with 3.13b1 when building on older systems. And, for those who would need or prefer to build cpython themselves, targeting a single OS release and hardware architecture is generally much simpler and more likely to work than trying to support a range of hardware and software versions as we do in the installer releases.
Thanks all, yes I completely agree that we should be keeping the minimum requirements as low as possible to be inclusive.
This allows people stuck in a time capsule to at least use the core distro to keep hacking in their own restricted world.
I agree with this, and I feel that tools like PyO3 which support the ecosystem potentially need to be even more conservative, because major packages in the ecosystem should ideally at least be buildable from source on macOS releases in line with EOL Pythons rather than the latest.
I also completely agree that CPython shouldn’t be basing its principles of what to support on what other pieces of the ecosystem want to support.
That said, the proposal to change to a minimum of macOS 10.13 seems totally reasonable to me and means that in ~5 years time when Python 3.12 is EOL then we will have no concern at all about updating the minimum past Rust 1.74 (though we may choose to update the minimum some time before then).
I think in general that most people will base their expectations of CPython’s macOS support on the installers, and even if it is buildable from source on older versions, this would naturally be limited to a set of advanced users who would also be prepared to jump through hoops to get the latest versions of tools on their aging OS.
I think it would be very beneficial to create some sort of policy document, because it’s a necessary “clock” for knowing what kinds of C features are available for general use (e.g. aligned allocation, C11 atomics, etc.). Xref also a previous time this came up: Moving packaging and installers to macOS 10.13 as a minimum
Establishing a somewhat predictable cadence will be useful not only today, but long into the future. C’s evolution has not stopped, and being able to (somewhere down the line) use features from the very bountiful C23 etc. is going to be worthwhile.
Additionally, Apple pushes users hard to upgrade, so the usage numbers drop off sharply. Almost a year ago, 99.95% (!) of Pillow’s macOS downloads were on >=10.13. Obviously a single library is a small sample, but the picture is pretty clear IMO. At some point, the benefits of the contortions needed to keep the very long tail of support going are really questionable.
For example, Chrome (which has a very wide user base) dropped support for <10.13 last year, dropped <10.15 in the meantime, and will drop support for <11.0 with Chrome 129 in ~October. Chrome’s support for macOS is also how Google defines its overall lower bound as expressed in its “foundational support matrix” (which forms the baseline for other Google projects like abseil, protobuf, etc.). Chrome was also the last holdout for LLVM’s libcxx 17+ to move its baseline to 10.13.
These projects might not be directly relevant to CPython, but they show up in a lot of places in the ecosystem. The combined effect of such key projects moving on was a lot of work in conda-forge that was necessary to support raising our baseline to 10.13 (just to keep compiling up-to-date versions), which was recently completed. It’s been almost 8 years since we bumped our baseline to 10.9 – if we applied the same standard today as then (last non-EOL version), we’d need to jump to 12.0. So from my POV, even 10.13 is extremely conservative today, and already small quality-of-DX improvements would make that bump worth considering.
There’s a third category in between: what’s the minimum version of macOS we “support” deploying to when someone else builds on a newer version of the OS? Currently that’s the same version as (2), but it doesn’t have to be.
The weak-linking support has a cost in a more complicated code base, particularly in the posix extension, which is already fairly complicated due to cross-platform support and support for various variants of the underlying C API (dir_fd and/or follow_symlinks).
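That complexity surfaces to Python callers through the `os.supports_dir_fd` and `os.supports_follow_symlinks` sets, which reflect what the running platform (and, under weak linking, the running OS version) actually provides. A small illustrative sketch (the helper `remove_in_dir` is hypothetical):

```python
import os

def remove_in_dir(dirpath, name):
    """Remove dirpath/name, using dir_fd-relative unlink where the
    platform supports it, and falling back to a plain path otherwise."""
    if os.unlink in os.supports_dir_fd:
        # dir_fd variant avoids races if dirpath is renamed concurrently.
        fd = os.open(dirpath, os.O_RDONLY)
        try:
            os.unlink(name, dir_fd=fd)
        finally:
            os.close(fd)
    else:
        os.unlink(os.path.join(dirpath, name))
```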
Dropping support for deploying to 10.9 using later SDKs would help in that regard.
In practice this still works, but Apple has indeed documented that deploying to older versions is no longer supported.
What doesn’t help is that Apple doesn’t have a clearly stated support policy for macOS; in earlier discussions it was mentioned that in practice they seem to provide security updates for the last three major releases (currently macOS 12 and later).
We “should” have a clearer policy on what versions of macOS we support and what happens when versions of macOS are no longer supported by us. E.g., do we remove the weak-linking support for 10.9 if we move to supporting 10.13 in Python 3.14?
That is:
What’s the oldest SDK version that we support building and running with?
What’s the oldest macOS version that we support as a deployment target?
What’s the oldest macOS version that we support in our installers?
Those can have three different answers, or we could choose to pick one answer for all three (AFAIK the latter was chosen for our Windows support).
Looking at https://everymac.com/systems/by_capability/maximum-macos-supported.html we could move from 10.9 to 10.11 as the minimum OS version without necessarily losing hardware (although users may have reasons other than hardware support to stay on older versions). Moving to 10.13, 10.15 or even 12 would drop more hardware, but fairly old systems: even moving to macOS 12 as the oldest supported version would only cut off hardware that’s at least 10 years old at this point, assuming I read the page correctly.
I haven’t tried to look at software features that were dropped in various versions and might cause users to stay on older versions of the OS.
One other reason for considering at least macOS 11 as the oldest version we still support, at least for the installers, is that it would give us the same SDK level for both supported architectures, removing one possible source of differences between the arm64 and x86_64 binaries in our installers.
Another reason to move to a newer version than 10.13 is that it reduces the risk that Apple drops support for the oldest deployment target halfway through our support window, although I have no idea what Mach-O/dyld and linker features would require a newer version than 10.13; those are the most likely reason for dropping support for older deployment targets.
My personal use cases no longer involve hardware that cannot run the latest OS version, and haven’t for a long time now. Hence mentioning a more ambitious cutoff than 10.13.
MacPorts has buildbots back to 10.6. There are developers on 10.4–10.5 as well (though no buildbots for those).
Anyone with Intel hardware can install 10.5 or 10.6 in a VM and test builds, even for PowerPC. Apple Silicon will be less easy for that purpose, but presumably QEMU should work.
There is no problem with testing on legacy systems.
It is an unfortunate conflation of what is supported in the sense of being actively worked on by a project’s developers (understandably, often only a few recent mainstream OS versions) versus what platforms the software can actually run on. This often results in completely wrong assumptions (I often hear something like “but that requires C++17, it won’t work on 10.6”, while in reality C++23 is supported) which, coupled with inaccurate statistics, lead to a weird situation where genuine fixes to the code base are rejected because developers assume no one benefits from them.
This is quite upsetting, and it leads to unnecessary fragmentation of development efforts. For example, we carry local fixes for Python in MacPorts, because upstream has no interest in improving the code for “unsupported” OS versions, even when the only thing required is to accept the results of work already done by someone else.
How do you deal with features that require support from the C standard library? Unless you start shipping your own C standard library to users, I don’t see how something like std::aligned_alloc can ever work.
It’s not with evil glee that I advocate dropping old versions, but driven by practicalities. First off, it’s really hard to test older macOS versions at scale. For example, conda-forge’s primary CI provider (for GHA and Azure Pipelines) is about to drop macOS-12 in a month, and from macOS-13 on, the oldest available Xcode is 14.x, which dropped support for targeting anything below 10.13[1].
This kind of situation is applicable to basically any compiled FOSS project. If the (up-to-date, easily available) default toolchains for your platform don’t support a target anymore, it’s essentially dead. Have you tried asking Apple/LLVM not to drop support for old macOS versions? That would perhaps have the biggest impact.
Likewise, if nothing ever broke, then I wouldn’t have reason to advocate raising the baseline either. However, given that conda-forge still targeted a 10.9 baseline until the middle of this year, I can definitely say that stuff regularly breaks, and as someone who’s spent 100s of hours on this, I consider it a completely ludicrous cost-benefit ratio (not just for myself, but even more so for less experienced contributors) to chase after bugs that affect far less than 1-in-10’000 of our users, running on decade-old OSes.
People are free to spend their free time however they want of course, but you can’t expect that others will sign up to support such fringe use-cases, or pessimize their code maintainability to scratch your favourite itch.
PS. This discussion between us is a bit of a déjà-vu
notwithstanding the point that conda-forge uses its own compilers (though of course LLVM – Xcode’s upstream – also dropped support below 10.13 in v17.0, at least for libc++, which we cannot avoid updating) ↩︎
Okay, let’s slow down a bit and stop getting so close to name-calling (e.g. “unfortunate confusion”, “completely wrong assumptions”, “ludicrous”, “fringe use cases”, and other phrases with negative sentiment).
Everything has been said, unless one of you wants to forge a compromise.
I like supporting as many users as possible. The only problem – and this isn’t personal in the slightest – is that resources are finite, and every project has to balance the long tail of support with all other maintenance tasks.
From personal experience, it takes a lot of work to support very old macOS versions in our distribution[1]. In a perfect world that wouldn’t be the case[2], but since it is, doing all that work for an absolutely minuscule[3] slice of our user base is hardly a judicious use of resources, because it means sacrificing fixes/updates/features for the other 99.99% of users.
So I agree that longer support is better – no questions there. IMO it’s Sisyphean to fight the tide of whatever lower bounds are imposed by default platform compilers, but I’m not stopping anyone from trying.
I think the only point of disagreement might be how much responsibility a given project (like CPython) has to support such long-EOL systems. My position is “essentially zero”, @barracuda156 seems to imply that it’s non-zero. In that case, perhaps PEP 11 needs a distinction based on #some_rule_regarding_macOS_version, and someone needs to champion Tier 3 support for whichever versions fall below the cut-off.
it suffices that out of 100 failures, 5 are really hard to fix, but every failure has a cost and may already block junior maintainers completely ↩︎
or at least build failures would get fixed by a magic contributor that’s always responsive ↩︎
hopefully that’s a neutral statement for less than 1-in-10’000 users ↩︎