I do wonder if there might be something that could be done in terms of the way we promote CPython releases. At the moment, docs.python.org and the main download links on python.org switch the instant we publish a new feature release. Folks who don’t know any better are actively pushed into using the latest and greatest version, even though we know the wider packaging ecosystem won’t have fully caught up yet.
Perhaps we could adjust things to give releases a formal “ecosystem update” period after the October release date. The release would be promoted out of its pre-release status (so all the usual backward compatibility guarantees apply, the support lifecycle timer starts ticking, and all the downstreams doing their own source builds anyway can get their respective release processes started), but the default download links wouldn’t switch yet, and the release download page would carry a caveat explaining the nature of the ecosystem update period and the potential impact of adopting a release that is still in that phase (i.e. if you build all your own packages from source, you’re fine, but if you rely on projects publishing pre-built binary artifacts, you may need to wait a while).
For example, perhaps the ecosystem update window could run from October to December, with the default download links only switching the following January. There would still be libraries without artifacts published after that date, but it would mean the actively engaged folks aren’t trying to get the entire stack updated in the time between rc1 and the final release.
Right now, that ecosystem update window exists in practice, but it’s entirely implicit.
Note: This post had started down a more technical path before the above idea occurred to me. I think this is still relevant to the topic as background, so I’ve included it, but it doesn’t provide anything that could potentially improve matters in the near term, so I moved it to the end rather than keeping the post in the order I originally wrote it.
Part of the challenge here is that the scale of the problem varies a lot depending on what domain people are working in (it’s the usual refrain of binary dependency chains being shallow in most domains, but spectacularly deep in data science and machine learning).
I recently dropped a project from targeting Python 3.12 back to Python 3.11, since one of its dependencies hadn’t published Python 3.12 wheels yet. That’s the nature of package distribution having a long tail, though: in many cases, rebuilds to support a new version are reactive, in response to demand, rather than proactive. That demand may not eventuate until the project’s main authors want to upgrade their own Python version (with users treating the problem as a version constraint rather than pestering the maintainer to publish new artifacts), or until the binary artifacts start falling more than a single release behind.
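As a concrete illustration of treating it as a version constraint: a project can cap its supported Python range in pyproject.toml until the affected dependency publishes new wheels. (The project and dependency names below are placeholders, not the actual packages involved.)

```toml
[project]
name = "example-project"          # placeholder project name
requires-python = ">=3.9,<3.12"   # cap below 3.12 until 3.12 wheels are available
dependencies = [
    "some-binary-dependency",     # placeholder for the package lacking 3.12 wheels
]
```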
Having projects switch to the stable ABI genuinely fixes this problem, as their existing binary artifacts continue working even on new CPython releases.
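A rough sketch of why that works: the ABI tag in a wheel filename tells installers what the wheel is compatible with. A version-specific tag like `cp311` only matches one feature release, while `abi3` (the stable ABI) matches the release it was built for and everything newer. The helper functions below are my own simplification, ignoring platform tags and many real-world subtleties of how pip actually resolves compatibility:

```python
def wheel_tags(filename):
    """Split a wheel filename into its (python_tag, abi_tag, platform_tag) triple."""
    stem = filename.rsplit(".", 1)[0]   # drop the ".whl" extension
    parts = stem.split("-")
    return tuple(parts[-3:])            # the last three fields are the compatibility tags

def compatible_with(filename, cpython_version):
    """Rough check: could this wheel work on the given CPython feature release?

    Deliberately simplified: ignores platform tags, pure-Python tags, and
    other details of real installer behaviour.
    """
    py_tag, abi_tag, _ = wheel_tags(filename)
    major, minor = cpython_version
    if abi_tag == f"cp{major}{minor}":  # version-specific ABI: exact match only
        return True
    if abi_tag == "abi3":               # stable ABI: the build version and anything newer
        built_for = int(py_tag[2:])     # e.g. "cp311" -> 311
        return major * 100 + minor >= built_for
    return False

# A version-specific wheel stops matching on the next feature release...
print(compatible_with("numpy-1.26.0-cp311-cp311-manylinux_2_17_x86_64.whl", (3, 12)))
# ...while a stable-ABI wheel built for 3.11 still matches on 3.12.
print(compatible_with("cryptography-41.0.0-cp311-abi3-manylinux_2_17_x86_64.whl", (3, 12)))
```

That second case is exactly the property that spares maintainers from the post-release rebuild scramble: the artifacts they already published keep installing on the new release.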
I’m also still mulling over some ideas which came up in the CPython CalVer thread about potentially splitting the way we version the CPython ABI from the way we version CPython as a whole, such that it would theoretically be possible to make a CPython release that remained backwards compatible with the previous release’s ABI (nothing coherent enough to even post an Ideas thread about yet, but I do think there’s potential in the concept).
Increasing the amount of time between the “ABI freeze” date and the “general release” date is unlikely to happen, though: there are genuine technical reasons we picked rc1 as the freeze date for the current scheme, where the Python version and the CPython ABI version are tightly coupled. And even with only two months, a large portion of the user base has all the binary dependencies they need available on day one (even data science users, thanks to the tremendous efforts on that front from the maintainers of the core data science stack).