Adoption of new Python in PyPI packages, longer RC periods?

Or alternatively, some kind of version marker that allows uploading a prerelease package without creating an entirely new release. Using dev or post versions isn’t quite in the right spirit, though technically it probably works well enough. Overwriting the previous nightly would probably be necessary for size/UX/sanity reasons.
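
As a sketch of how dev versions behave today (package name hypothetical): PEP 440 already sorts them below pre-releases and final releases, and pip won't pick them up unless explicitly asked:

```
# PEP 440 ordering: dev builds sort below pre-releases and final releases
2.1.0.dev20241001 < 2.1.0.dev20241002 < 2.1.0rc1 < 2.1.0

$ pip install --pre mypkg   # opts in to dev/pre-release versions
$ pip install mypkg         # by default pip only considers final releases
```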

There shouldn’t be any harm in putting nightlies straight to PyPI, but we likely need to change something to make it pleasant.

4 Likes

When CPython puts out a release and calls it 3.13.0rc2 rather than 3.13.0 final, it does so very deliberately, with the understanding that it reserves the right to make further incompatible changes. Pushing to PyPI is not reversible, so it is obvious why package authors would be reluctant to push packages for CPython 3.13 while CPython itself still explicitly reserves the right to break them.

What I want is not really to release earlier but to be able to push builds that people can use for advance testing. I want to make them available for interested parties, but without irreversibly pushing them directly into a sort of live production environment that millions of people are continuously downloading from.

2 Likes

As for cibuildwheel, the problem with shipping a binary with an ABI change is that it’s really hard to know you need to fix it, and to fix it

Yeah, I’m worried about asking maintainers to try to patch over ABI breaks. Relatedly, ABI compatibility in Python: How hard could it be? | Trail of Bits Blog

Adjusting the labelling of the releases is pure marketing […] ultimately, the point of all pre-stable releases is that we can make changes

I think an ABI freeze is more than just marketing; besides, marketing is useful when getting people to try a thing! Empirically, as far as I can recall, we’ve been able to do a good job avoiding ABI breaks. The one case I remember is 3.9.3 (which wasn’t even a pre-release).

I still want to wait for 3.13 final release before releasing a version/wheel […] When CPython puts out a release and calls it 3.13.0rc2 rather than 3.13.0 final it does so very deliberately with the understanding that it reserves the right to make further incompatible changes.

While maintainers do have the ability to prevent user installations prior to final release for regular extension modules, this isn’t true of pure Python (or limited API) wheels, so CPython’s incentives are very much aligned with “try to keep already uploaded wheels working” come RC time.

I also found Thomas’s explanation in this thread of why changes in a 3.13 release candidate are considered much riskier than the things we backport to 3.13 patch releases quite compelling (otherwise, by and large, the branch policies seem similar between the RC and patch phases).

Anyway, I’d like risk-averse maintainers to have no reservations about releasing wheels during the release candidate stage. This makes me feel more warmly about a gamma phase, both to communicate expectations to maintainers who are more risk-averse and to avoid any potential dilution of what “release candidate” implies. Something like:

  • 6 months alpha (anything goes)
  • 2 months beta (feature freeze)
  • 2 months gamma (ABI freeze; encourage wheel building, cibuildwheel does so by default, etc.)
  • 2 months RC (minimal changes, branch locking; really we try to ensure this is maximally similar to final)

Compared to today’s lifecycle, RC1 happens at the same time (2 months before final). But hopefully we’d have more confidence by then than we do today, and subsequently we’d get even more workloads tested during the RC period.

Also, do note that technically you can yank a wheel or shadow it with a wheel that has a higher build tag (or even delete a wheel).
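
For anyone unfamiliar with build tags, a minimal sketch with hypothetical filenames: the optional numeric build tag sorts a rebuilt wheel above the original for the same version, so installers prefer it:

```
mypkg-1.2.0-cp313-cp313-manylinux_2_17_x86_64.whl     # original upload
mypkg-1.2.0-1-cp313-cp313-manylinux_2_17_x86_64.whl   # build tag "1" shadows the original
```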

I never realised just how similar the beta and RC candidates are intended to be to the real release […] others in that boat

Thanks for sharing! The good news is the graphs indicate that more people seem to believe this :slight_smile: I also wonder if some of the effect is people getting more accustomed to the yearly schedule.

I should note, since people will sometimes look at the absolute values on this plot and say that 3.x isn’t ready to be used: a lot of packages support 3.13

Yup, that’s right. In my sample of 1312 packages we use at work on Python 3.11, only 771 have an explicit wheel or classifier for 3.11, and at the time my workplace upgraded to 3.11, under 600 did.

What I’m most curious about though is the “network topology” of these updates

I’ll do some digging and report back if there are nice graphs to be had.

Opinionated summary of thread

People seem to agree it would be good to extend the period of time prior to release when it’s easy to install third-party dependencies, and that this would likely help us get feedback sooner.

While some folks seem in favour of a different release lifecycle to enable that, there’s also support for accomplishing this goal with some form of auxiliary index, versioning scheme, or “experimental wheels”. My cynical take is that a solution requiring new packaging infrastructure is less likely to be implemented, and less likely to be discovered or understood by users.

3 Likes

I’m not clear on this. No-one should use CPython pre-releases in production, as mentioned in release announcements: “Please keep in mind that this is a preview release and while it’s as close to the final release as we can get it, its use is not recommended for production environments.”

Do you worry that, when the final release comes out, people will start using your wheel built against the pre-release and it will be incompatible? Does the ABI guarantee of the RC not help? Would cibuildwheel and Trusted Publishers help with your release process, so you can push out new wheels more easily? What else could we do to encourage releasing during the RC phase?

:100:

I’d be more in favour of adjusting the lengths of alpha/beta/RC than adding a gamma phase, which I think would be hard to explain (also “gamma” is sometimes used as another term for RC).

2 Likes

As said above, part of the issue is a labelling problem, IMHO (also described above as “marketing”). People think of a pre-release as something that’s not the final thing, and therefore a moving target.

An alternative solution would be to shorten the RC phase and mark the x.y.0 version as provisional, with the follow-up x.y.1 version being the recommended one for risk-averse users.

2 Likes

If CPython says “don’t use this CPython in production” then should that not equally apply to the packages that are built against it? Why should someone make final releases of a package that are built against something that is not recommended for production?

Yes, this is precisely the problem. I don’t want there to exist a known broken version of the package on PyPI.

No, because ABI compatibility is not the only kind of breakage.

Also, people downstream are likely to be building from source as well, and in my experience the most common kind of breakage for built packages is build failure.

I already use each of these where appropriate.

I disagree with the premise here.

To me it seems evidently backwards for dependents to release before their dependencies. There is an expectation here that if B depends on A, and A will one day put out v1.1, then B should put out a release for A’s v1.1 before A has even finalised what its v1.1 is going to be. Note that there is a big difference between B making changes in preparation for A’s v1.1 release and B actually pushing out a final release that has A <= 1.1 in its immutable dependency metadata.

The suggestion here is that these releases of packages for use on prerelease CPython should be pushed out because it helps with testing, which it does, and that is the model we have fallen into. There should be a way to make the testing possible that is separate from making final releases, though.

When it comes to the actual final releases, it absolutely makes sense that CPython finalises its release, then package A finalises its release afterwards, then B that depends on A finalises after A, and so on. I don’t see why anyone would expect otherwise.

There is some discussion above about what the prerelease versions mean, but to me there are just two types of release that matter: prereleases and final releases. Steve made this point clear above by saying that the purpose of designating a release as a prerelease is to reserve the right to make further changes that would not be considered acceptable after the final release. From my perspective, the prereleases are just more convenient and stable than testing directly against upstream git.

Let me describe an example in a little detail to make it clear how this looks from my perspective. A package I have recently been developing and maintaining is called python-flint. Apart from C compilers, Meson and other build tools, the main dependencies for python-flint are CPython, Cython and FLINT. These are the versions of those dependencies that have been released in the last year or so, along with the next releases expected in future:

  • CPython 3.12 and 3.13
  • FLINT 3.0, 3.1 and 3.2
  • Cython 3.0 and 3.1

Out of those 7 releases, 6 have broken python-flint completely, meaning that every existing version of python-flint would simply fail to build with the latest version of one of its dependencies. The exception is CPython 3.13 (if you ignore the free-threading build). In most cases the compatibility fix needed in python-flint is trivial, but backporting these fixes is not trivial.

This hasn’t been done in the past, but what I want going forwards is to have upper and lower version caps for all of these dependencies in all releases. We can and do test against -dev versions of CPython as well as the main branches of FLINT and Cython. That is how I already know that Cython 3.1 and FLINT 3.2, which are not released yet, will break all existing releases of python-flint. Keeping the git branches in sync does not help with the fact that all past releases get broken, though, and PyPI does not allow us to go back and retrospectively add even simple version constraints like CPython < 3.13.

At the time of the next python-flint final release, I want to be able to say that the version constraints listed in pyproject.toml and meson.build correspond to fully tested, final release versions of all dependencies (is that unreasonable?). Then I want anyone who in future tries to install some version of python-flint, from a wheel or from the sdist, to get a clear error message if they don’t have known-compatible versions of the dependencies (including CPython).
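
As a rough sketch of what those caps could look like in pyproject.toml (the exact bounds here are invented for illustration; the FLINT range would be enforced separately in meson.build):

```toml
[project]
name = "python-flint"
# Reject CPython versions that no release has been tested against.
requires-python = ">=3.10,<3.14"

[build-system]
build-backend = "mesonpy"
requires = [
    "meson-python",
    # Known-compatible Cython range for this release.
    "cython>=3.0,<3.1",
]
```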

Of course it is possible that CPython 3.13.1 might introduce some breakage, but at that point we can open a bug report and the onus will be on CPython to fix the issue in 3.13.2, because it would be considered a “regression” between final releases rather than just some change between prereleases. From my perspective, this is what it means when the release manager decides to name the latest release 3.13.0rc3 rather than 3.13.0 final: CPython is not quite ready to give us that regression-fixing commitment for 3.13.

In the meantime there are wheels for python-flint on CPython 3.13, but you need to install a prerelease of python-flint to get them: `pip install --pre python-flint`.

6 Likes

This is more or less how I’ve operated for actual coding projects (when I want to just get something done without considering compatibility matrices), because the scientific Python stack often takes about that long to work out any kinks. And I’ll probably continue to do so, and recommend colleagues do the same, regardless of how quickly wheels get published, because there’s never a new Python feature that’s worth redirecting scientific coding time into producing MREs to post on the cpython or numpy bug trackers.

Which is just to say: I think the current approach is working fine. Sure, feel free to tweak the process and make things smoother, but for many people, these are details that will remain happily invisible.

Just to point out yet another option: we maintain an Ubuntu 24.04-based CI image which gets automatically updated nightly and always provides the latest of all active Python versions, including 3.13 free-threaded builds.

We had some discussion at the core dev sprint about folding these into other container efforts at the Python org level. @corona10

3 Likes

With PEP 694 I think it could work. You could create a public (i.e. guessable) stage for a pre-release version and let folks test those unpublished wheels out. You could have a rolling pre-release stage that you delete without publishing once your “real” next release is ready.

I think a core piece to make this viable is a single-switch option for consumers to opt into all the wheels for the prerelease platform; hence the beta.pypi.org index. Having to specify non-standard names for a package and each of its dependencies is not going to work out, just as I fear that publishers having to invent version numbers for each prerelease build won’t either.

Perhaps one of the companies with infrastructure could consider hosting a public[1] index that allows overwriting? That way (e.g.) numpy can keep uploading version 2.1.0 every day from their CI system, and consumers always get the latest even though the version hasn’t actually changed (provided they specify that index rather than PyPI).
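
From the consumer side that would just be a matter of pointing pip at the extra index (URL made up here; --pre only matters if the nightlies carry dev/pre-release version numbers):

```
$ pip install --pre --extra-index-url https://nightlies.example.org/simple numpy
```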

I think this can be achieved without changing or defining any specs. Just not on PyPI itself.


  1. Or semi-public, in that publishers need one-time approval to publish to it. ↩︎

3 Likes

You can push new wheels to a pre-existing release and you can mask a wheel with another one using build tags. It requires some intervention to do all of that, but it is all doable today.

Not necessarily, as CPython is the one changing here while your project may not be, especially if the changes that are happening have nothing to do with your project.

2 Likes

Yeah, I think that’s probably appropriate for packages that release against our RC and then have to rebuild for the final release. I’m not sure it’s okay for publishing nightlies onto PyPI proper.

2 Likes

Maybe something like GitLab’s package registries could be used? Maybe GitHub has something similar?

They used to have plans for it, but I guess it didn’t seem worth pursuing and they dropped it altogether.

This is not the reason, but I don’t think the reason has been shared publicly.

In any case, GitHub does not offer Python package feeds, but even if they did, they’d be per-repository or per-organisation, whereas we need something that’s more open.

Azure Artifacts does allow it, and has user management that is separate enough from GitHub organisations that we could provide access to projects that want it. It really just requires a person to admin the credentials (we have a Python organisation already, as we use it for builds, though I think this feed would fit under the free plan for anyone anyway).

1 Like

Meta: I wish that both time and package scales were synchronised across the graphs.

The way it is now, it’s quite hard to compare 3.13 to 3.11, or naively it looks like “wow 3.13 is a rock star!”

The fourth and fifth graphs are 3.11, 3.12 and 3.13 overlaid (with the same time and package axes).

2 Likes

Could we get updated graphs now that 3.13.0 is out? I’ve been trying to get the 3.13 classifier into as many places I am part of as possible. I’ve got everything I actively maintain with a 3.12 classifier either updated, or in a PR waiting for approval. pipx is stuck due to a fully capped Poetry package in its test suite. Nox will be out today or tomorrow.

IMO, the release period length is pretty good; the problem is that even I don’t prioritize getting classifiers updated, since most everything works unless a dependency needs updating, except for a select few packages (like pybind11). I don’t think a longer RC phase would help any of the packages I work on. Just better messaging around “betas and RCs are for packages to prepare for new releases”, and fewer “wait till 3.x.1 to update” articles. :wink:
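
For the record, the change itself is a one-liner in each project’s metadata, e.g. in pyproject.toml:

```toml
[project]
classifiers = [
    "Programming Language :: Python :: 3.12",
    # the line being added across projects:
    "Programming Language :: Python :: 3.13",
]
```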

7 Likes

Could we get updated graphs now that 3.13.0 is out?

:tada: Sure, here are updated graphs for today:


This is the same sample of 1312 packages we currently care about at work. I tried with Hugo’s top 8000 PyPI packages by download count, but it looked weird (e.g. 400 “mypy-boto3-*” or 100 “pyobjc-*” packages all declaring support at the same time); the work sample feels more representative of real life anyway. Note that since the initial post, I improved the python_readiness recipe with a version-aware bisection that can find earlier releases with support than the previous naive backwards linear search could. Finally, I also included Python 3.10.
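
The bisection idea, as a simplified sketch (not the actual python_readiness code; `supports` stands in for whatever wheel/classifier check is used):

```python
def first_supporting_release(releases, supports):
    """Find the earliest release that declares support.

    `releases` must be sorted oldest-to-newest, and `supports` must be
    monotone (once a release declares support, later ones do too);
    that monotonicity is what makes bisection valid, unlike the old
    naive backwards linear scan.
    """
    lo, hi = 0, len(releases)
    while lo < hi:
        mid = (lo + hi) // 2
        if supports(releases[mid]):
            hi = mid        # support present here; an earlier release may have it too
        else:
            lo = mid + 1    # not yet supported; look at later releases
    return releases[lo] if lo < len(releases) else None
```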

8 Likes

Thanks for the updated data, and this is an interesting discussion.

As a maintainer of projects other than CPython, it’s helpful to be able to test pre-releases on CI. I don’t expect all of my dependencies to do the same. At this stage, I’m looking at it through the lens of what will possibly need attention when I adopt 3.13 and release a package. I don’t think a longer RC period would improve my workflow, as I would be unlikely to release before 3.x.0 hits.

What is the long-term goal?

  • a: to get packages to release stable versions during the RC phase, so it’s easier to adopt at the 3.x.0 release, or
  • b: to provide mechanisms for package maintainers to test their code earlier in the release cycle?

2 Likes