Python 3.14.0rc3 is go!

It’s :magic_wand: finally :magic_wand: the final 3.14 release candidate!

https://www.python.org/downloads/release/python-3140rc3/

Note: It’s another magic release. We fixed another bug that required bumping the magic number stored in Python bytecode (.pyc) files. This means .pyc files created for rc2 cannot be used for rc3; they will be automatically recompiled.
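If you want to check which bytecode version produced a cached file, the magic number is exposed in the stdlib and stored in the first 4 bytes of every .pyc file; here’s a minimal sketch (the .pyc path is hypothetical):

```python
import importlib.util

# The running interpreter's 4-byte bytecode magic number.
print(importlib.util.MAGIC_NUMBER.hex())

# Compare against the header of a cached file: a mismatch means it was
# compiled by a different bytecode version and will be recompiled on import.
with open("__pycache__/example.cpython-314.pyc", "rb") as f:  # hypothetical path
    print(f.read(4) == importlib.util.MAGIC_NUMBER)
```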

The ABI isn’t changing. Wheels built for rc1 should be fine for rc2, rc3 and 3.14.x, so this shouldn’t affect too many people.

This release, 3.14.0rc3, is the final release preview. During the release candidate phase, only reviewed code changes that are clear bug fixes are allowed between this release candidate and the final release.

The next release of Python 3.14 will be the final release, 3.14.0, scheduled for Tuesday, 2025-10-07.

There will be no ABI changes from this point forward in the 3.14 series, and the goal is that there will be as few code changes as possible.

Call to action

We strongly encourage maintainers of third-party Python projects to prepare their projects for 3.14 during this phase, and to publish Python 3.14 wheels on PyPI, both to be ready for the final release of 3.14.0 and to help other projects do their own testing. Any binary wheels built against Python 3.14.0 release candidates will work with future versions of Python 3.14. As always, report any issues to the Python bug tracker.
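To see why wheels built against the release candidates keep working, you can list the tags the interpreter accepts; a small sketch using the third-party packaging library:

```python
from packaging.tags import sys_tags  # third-party: pip install packaging

# The most specific tags come first; on CPython 3.14 these are cp314 tags,
# and wheels carrying them stay installable across all 3.14.x releases.
for tag in list(sys_tags())[:5]:
    print(tag)
```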

Please keep in mind that this is a preview release and while it’s as close to the final release as we can get it, its use is not recommended for production environments.

Core developers: time to work on documentation now

  • Are all your changes properly documented?
  • Are they mentioned in What’s New?
  • Did you notice any other changes you know to have insufficient documentation?

Major new features of the 3.14 series, compared to 3.13

Some of the major new features and changes in Python 3.14 are:

New features

(Hey, fellow core developer, if a feature you find important is missing from this list, let Hugo know.)

For more details on the changes to Python 3.14, see What’s new in Python 3.14.

Build changes

Incompatible changes, removals and new deprecations

Python install manager

The installer we offer for Windows is being replaced by our new install manager, which can be installed from the Windows Store or from its download page. See our documentation for more information. The JSON file available for download below contains the list of all the installable packages available as part of this release, including file URLs and hashes, but is not required to install the latest release. The traditional installer will remain available throughout the 3.14 and 3.15 releases.

More resources

And now for something completely different

According to Pablo Galindo Salgado at PyCon Greece:

There are things that are supercool indeed, like for instance, this is one of the results that I'm most proud of. This equation over here, which you don't need to understand, you don't need to be scared about, but this equation here tells you the maximum time that it takes for a ray of light to fall into a black hole. And as you can see the math is quite complicated but the answer is quite simple: it's 2π times the mass of the black hole. So if you normalise by the mass of the black hole, the answer is 2π. And because there is nothing specific about your choice of things in this formula, this formula is universal. It means it doesn't depend on anything other than nature itself. Which means that you can use this as a definition of π. This is a valid alternative definition of the number π. It's literally half the maximum time it takes to fall into a black hole, which is kind of crazy. So next time someone asks you what π means you can just drop this thing and impress them quite a lot. Maybe Hugo could use this information to put it into the release notes of πthon [yes, I can, thank you!].
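For the curious, here is the claim from the quote spelled out, taking Pablo’s statement at face value (this is a restatement, not an independent derivation). In geometrized units (G = c = 1), the maximum infall time τ_max for a black hole of mass M satisfies:

```latex
\tau_{\max} = 2\pi M
\quad\Longrightarrow\quad
\frac{\tau_{\max}}{M} = 2\pi
\quad\text{and}\quad
\pi = \frac{\tau_{\max}}{2M}.
```

In SI units the mass picks up the usual factor, τ_max = 2πGM/c³; normalising by M is what makes the number dimensionless and universal.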

Enjoy the new release

Thanks to all of the many volunteers who help make Python Development and these releases possible! Please consider supporting our efforts by volunteering yourself or through organisation contributions to the Python Software Foundation.

Regards from wonderful Cambridge,

Your release team,
Hugo van Kemenade @hugovk
Ned Deily @nad
Steve Dower @steve.dower
Łukasz Langa @ambv
Savannah Bailey @savannahostrowski

30 Likes

That’s :banana: :banana: !

8 Likes

This might be the strongest argument for the Tau Manifesto I’ve ever seen.

8 Likes

CI images are ready.

1 Like

This is really disruptive. For conda-forge, bumping the magic number is essentially as impactful as breaking the ABI – i.e. we need to fully rebuild all packages again, because the invalidation of the bytecode published in existing artefacts for Python 3.14 would lead to massive performance regressions upon first use.

In the current migration based on rc2, we’ve built 1000+ packages for Python 3.14, many of which get rebuilt only rarely (in some cases only when a new Python version comes out), so we cannot rely on these regressions sorting themselves out over time.

Granted, it’s not quite as bad as an ABI break, in the sense that we’re not causing crashes in user environments if unhandled, but it causes a very real strain on our infrastructure and volunteer maintainer time.

If this keeps happening, a possible outcome would be that conda-forge stops building for the release candidates (this option was raised as a reaction to the PR to trigger a rebuild), which I think would be a loss for everyone: it would mean going back to the days when compatibility with a new Python version took months, rather than having launch-day availability of thousands of packages (and, to a lesser degree, it would push back the contributions of some of our maintainers towards getting upstream projects compatible with the new version).

As such, it would IMO be reasonable to treat these magic number bumps like ABI breaks: off-limits during the rc phase, and definitely off-limits after rc2.

2 Likes

But wouldn’t the trade-off be that some bugs can’t be fixed? I would think that finding and squashing bugs like the one described is the primary goal of the release candidates.

9 Likes

I think it’s good to bring this up, and something to be mindful about when considering late-phase magic number bumps, although as @jamestwebber points out, prohibiting them means some late-phase critical bugs might not get fixed, and that’s not good either.

But diving into the root problem a little more leads to a question about why .pyc files are included in conda packages in the first place. Maybe there’s a good reason to do so, but IIRC from my Debian/Ubuntu days[1], they don’t include .pyc files in the artifacts, but instead byte-compile them on install. Hmm, yes that still seems to be the case.


  1. and it’s been ~8 years since I was active there, so :person_shrugging: ↩︎
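For concreteness, a minimal sketch of the byte-compile-on-install approach described above, using the stdlib compileall module (the site-packages path is hypothetical):

```python
import compileall

# Run in a post-install hook instead of shipping .pyc files in the artifact:
# compile every module under the installed tree against the target interpreter.
compileall.compile_dir(
    "/usr/lib/python3.14/site-packages/somepkg",  # hypothetical install location
    quiet=1,      # report errors only
    force=False,  # skip modules whose cached .pyc is already up to date
)
```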

2 Likes

I understand that the trade-off is not attractive either way. If you follow the link to the discussion I posted, you’ll see that I said it’s understandable to fix a crash.

But if a bug fix needed an ABI break, it probably[1] also wouldn’t be done, because – aside from finding bugs – that would invalidate one of the key goals of the rc phase: that the ecosystem can start building out the graph of dependencies, relying on the fact that Python won’t break the things that get created in the process.

Basically, this is about making the cost of bumping the magic number transparent, and about only doing it if the cost of not doing it outweighs the cost of doing it (taking into account, for example, how rare the conditions that trigger the problem are, how big the impact is, etc.).

It’s a natural consequence of doing binary distribution. You want the packages to be ready upon installation, not to wait minutes or longer on first use while the bytecode for N dependencies in your environment compiles.


  1. as always, depending on circumstances ↩︎

2 Likes

But isn’t that solved by at-installation precompiling (like Barry says)?

3 Likes

I don’t know the history of conda’s choices here, but conda installers like micromamba don’t ship with a version of Python, so they cannot compile the bytecode at installation time.

And personally, I could easily be sold on the argument that compiling once per build, rather than on every install, is a good efficiency tradeoff for popular packages.

2 Likes

But if there’s a package that has Python code that needs to be compiled to bytecode, installing that package into an environment will be pointless unless a version of Python is also installed in that environment[1]. I’m not actually sure whether the conda install logic installs each package in dependency order, but it seems reasonable that it might. If it does, then a Python package would have Python as a dependency and so could assume Python is already installed by the time the package is installed, and thus it might be possible to work something out where the install process for a Python package runs a byte-compile step. But it would presumably require some tweaks to the conda install mechanism.


  1. barring some edge cases like a non-Python package with some optional Python parts ↩︎

Perhaps I’m not getting your point here. There are two times during the lifecycle of a conda package that can be called “at installation time”. The first one happens when building the package, which is determined (quite literally) by installing the package into a controlled environment, and comparing a before/after snapshot to determine which files were added (then processing those to be relocatable and packaging them into an artefact that’s essentially a compressed archive plus metadata). This is the natural time to do precompilation of anything, including Python bytecode.

The second time that could be called “at installation time” is when a user installs the artefact created by the above build process. At this point we don’t want to do any further work except unzipping an archive and putting its contents into the filesystem. It’s the job of the metadata (e.g. which Python version, which library dependencies, etc.) and the solver to ensure that everything is functional at that point.
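As an aside on the build-time option: when .pyc files are produced at package-build time, hash-based invalidation (PEP 552) is often a better fit than the default timestamp mode, since archive/unpack cycles don’t preserve mtimes reliably. A minimal stdlib sketch (the module path is hypothetical, and whether conda-forge uses this mode is not something I’m claiming):

```python
import py_compile

# Precompile at build time with hash-based invalidation (PEP 552): the .pyc
# embeds a hash of the source rather than its mtime, so it stays valid after
# the files are archived, relocated, and unpacked with fresh timestamps.
py_compile.compile(
    "pkg/module.py",  # hypothetical source file inside the build prefix
    invalidation_mode=py_compile.PycInvalidationMode.CHECKED_HASH,
)
```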

1 Like

I don’t think it’s particularly useful to dissect conda-forge’s choices in this matter. They include precompiled bytecode in binary builds; whether that’s a good choice or not, it means that a bytecode magic number change is problematic for them, and therefore the impact of such a change on the ecosystem as a whole is somewhat bigger than perhaps we were aware.

It’s not like we make changes like this during the RC phase lightly. Any information about impact is useful to the RM when they make the call, but we’ll still make such changes if it seems like the benefits are greater than the costs.

If conda-forge want advice on whether precompiling is a good idea, I’m sure there are plenty of people who will offer opinions :slightly_smiling_face: But it feels like it’s probably off-topic for this thread.

18 Likes

Thanks Paul!

My observation (no finger pointing) is that @hugovk was very cautious when the first magic number bump happened for rc2. But no similar discussion was had (at least publicly, to my knowledge) when another bump occurred for rc3 – ironically, this could be because no-one screamed when it happened for rc2, but at least in our case, that was because we waited to start our rebuilds until rc2 was available, precisely due to this issue.

2 Likes

@h-vetinari Thanks for letting us know, it’s appreciated to get feedback like this.

I was cautious this time as well. The decision to bump the magic number again wasn’t taken lightly; we discussed it in our internal Discord and decided it was better to fix this bug during the RC phase, and definitely before the final release.

Because the first bump meant we needed an earlier RC2, and I’d already slotted in an extra RC3 with less than a week until its planned date, it didn’t seem worth moving that one earlier.

For next time (but hopefully there won’t be one…), would it have helped if I’d posted that the second bump was coming?

8 Likes

Knowing this earlier is (ceteris paribus) always better, because then we can just pause the migration and thus reduce the amount of work lost.

But my hope would be that:

  • such magic number bumps can be kept out of the rc phase entirely, and definitely avoided after rc2 (until then we’ve generally only rebuilt a small subset of packages, so a restart is less costly; but the rebuild ramps up quickly from there, and now we’re looking at throwing away tens of thousands of CI hours and redoing a substantial amount of the human effort spent shepherding the rebuilds).
  • where a need for such bumps arises during the rc phase, that the impact is weighed against the (large, mostly externally-borne) cost.
6 Likes