This is also my feeling. It makes the (approximate) release cadence easier to remember: “New Python releases come out in <month>.”
I think we should try this. The points already being made about release process automation are key. Without that, having subsequent monthly bugfix releases is a burden on everyone involved in prepping binaries and dotting all the i’s and crossing all the t’s required for a new release to happen.
As for LTS, I think it’d be a good idea for us to create such a concept. But LTS need not be tied to this and should not block us here; we’ll want to define that concept separately later.
For our side processes with terms that are tied to N release cycles, I suggest we update those to N*1.5y for starters, unless we see a good reason otherwise. If we define an LTS concept in the future, we may want to realign some of those to sync with that cadence.
Here’s another option to put on the list (I don’t have a conclusion about what’s best): go to releasing every 6 months. That divides 12 months, and also divides our old release cycle – folks who liked the 18 month cycle can take every 3rd release. We could potentially make those “LTS” – i.e., they keep the same support term that we have right now, while we shorten the term for the new intermediate releases, to keep the backporting burden under control.
Right now it’s ok to support only the two or three latest versions of Python. But if the release schedule switches to 6 months, it’s probably not ok anymore.
There’s a cost for third-party packages to having to support multiple Python versions. They have to maintain CI and packaging jobs. For most projects this is still lightweight, but not for all. For example, building PyArrow for a given version of Python can take on the order of 10 minutes of CI time on a single platform…
Right, the only way we move to faster feature releases without frustrating all of our users is to avoid making certain changes in each.
For example, guarantee no language/parser changes, or no C API changes, or no library API changes, except at certain points.
And if we align that point at every 18 months, we have our current system.
I’m not even convinced that more frequent releases will necessarily make a difference. We just allocated a CVE today for a regression that was never actually released, because 3-4 Linux distros had picked it up on their own and released it. So apparently our release cycle doesn’t even matter that much?
PEP 387 – Backwards Compatibility Policy (peps.python.org) could potentially help standardize this. I know @methane won’t like this, but if we made deprecations go PendingDeprecationWarning → DeprecationWarning → removal, that actually gets us to 2 years before a feature is removed versus 18 months, and also gives us a release to see whether we are making a mistake with the removal before we more heavily commit to it.
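For concreteness, here is a minimal sketch of what that two-step path could look like for a library function; the names `old_api`/`new_api` are made up for illustration, and this is only one possible reading of the proposal, not PEP 387 wording:

```python
import warnings

def new_api():
    """The replacement API."""
    return "result"

# Release N: PendingDeprecationWarning is ignored by default, so only
# maintainers who opt in (e.g. -W error::PendingDeprecationWarning) see it.
def old_api():
    warnings.warn(
        "old_api() will be deprecated; use new_api() instead",
        PendingDeprecationWarning,
        stacklevel=2,  # point the warning at the caller, not at this wrapper
    )
    return new_api()

# Release N+1: switch the category to DeprecationWarning, which is shown by
# default in __main__ and by most test runners, so end users start seeing it:
#
#     warnings.warn(
#         "old_api() is deprecated; use new_api() instead",
#         DeprecationWarning,
#         stacklevel=2,
#     )
#
# Release N+2: old_api() is removed entirely.
```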
I’m talking about a system like Ubuntu or Django use, where you do fairly frequent releases (6 months for Ubuntu, 8 months for Django), but most releases are only supported until roughly the next release, and you only do an LTS every once in a while (they both do one every 2 years). Right now we effectively do an LTS every 18 months, and every release is an LTS. So yeah, there might be some increase in how many Python versions are live at a time, and we should like… draw pictures of all these options to understand exactly what they mean. But I’m not talking about a 3x increase, or even a 2x increase like in the PEP 596 draft, but more like a 1.5x increase.
For features, it definitely does matter.
Security fixes tend to be rushed in sooner. And versions that aren’t supported by the CPython team anymore do get some limited backports.
But 3rd party projects don’t typically drop support for Python versions as soon as python.org does. So while there may be a limited increase in burden for the core devs, it’s not immediately obvious to me that there won’t be a higher impact on 3rd parties.
You seem to assume that people will always upgrade to the next Python version ASAP. But that’s only true in certain contexts. For example, people who use their distro-provided version will by definition be locked to their distro’s release or update schedule. Some people will be locked to company sysadmin practices, etc.
Yeah, it’s hard to say exactly what third-party devs will do. It’s even hard to say what they do today :-). But from a few spot checks: it looks like pip dropped Python 3.4 support ~1 month after python.org did, while django, numpy, and arrow had all dropped it some months before. So at least right now, the python.org support seems to put a pretty solid limit on what third-party devs are interested in supporting.
At some point we’ll want to solicit feedback from them.
One thing that is particularly hard here is that we’ve been in a weird time in Python’s history. If you wanted to continue to support Python 2.7, the cost of supporting additional Python 3.x versions was typically not very high: you couldn’t use any of the new 3.x features anyway because you were locked to 2.7, so it was primarily just regression testing on the various Python versions.
In the future when 2.7 is less of a concern, I don’t think we really have a good “pulse” on what the landscape is going to look like.
More frequent releases will also change things a lot – if each release has fewer changes, then all else being equal it will be easier for third-party libraries to support more releases. How much easier is an open question…
I was bored and having trouble visualizing the different approaches so I drew a picture.
Notes:
- This doesn’t show beta/rc periods, just support commitments for final releases
- It doesn’t try to show what third-party packages would want to support, just what python.org would be supporting
- There are lots of other options too (e.g. the 1-year cadence proposed upthread, or Django’s model, which is like Ubuntu’s but with an 8-month cadence and every third release as an LTS), but I guess you can imagine what those would look like.
- I’m not arguing for any of these, just trying to understand what the tradeoffs are.
I don’t think so. “Support” doesn’t just mean you have to fix bugs. Fixing bugs is often the easy part, because Python is careful not to introduce breaking changes, so it’s usually just some small details (if any). “Support” also means you have to take care about CI and packaging, and those don’t get easier because Python releases have fewer changes between them.
Yes, I dislike it.
PendingDeprecationWarning is for “not deprecated, but not recommended and will be deprecated later”. It is not for “N-2 Deprecation”.
We should use DeprecationWarning from the N-2, or even the N-3, release.
Thanks for those diagrams Nathaniel, they help. To me, an Ubuntu-style release schedule seems most useful. People who want less churn can use the LTS releases. If you don’t officially designate an LTS release, the community will probably do it ad hoc (e.g. pick whatever version Red Hat releases with).
Now that Python 2.7 is truly dying out, I think we need to be extra cautious about making incompatible changes to 3.x. The PyCode_New change is an example of something that could have been handled better, IMHO. I think with some extra care, it could have been less disruptive.
Let me put my Fedora hat on.
Fedora releases generally live for 1 year, with several months before release for stabilization. If a Python release is supported for 9 months, and 3–7 of those months pass before it ships in Fedora (for stabilizing both the interpreter and all the software using it, for actually updating that software, and for calendar mismatches), the Python release would go EOL long before the OS would.
In the “Ubuntu style” model above, we’d probably end up treating non-LTS releases as alpha/beta.
In the “PEP 596 draft” model, we’d probably end up skipping some releases as well.
How do you handle Ruby in this case?
Wearing my Steering Council hat: we discussed this at the SC meeting today, and we’d like the proposal to change the release cadence clearly separated from the Python 3.9 release schedule. The procedural issue is that the cadence change is a proposal that needs collective review & approval by the core development team (including the SC), whereas the schedule itself is an administrative document maintained by the Release Manager.
Switching back to speaking just as myself: given that it’s going to be difficult to write the actual 3.9 release schedule until you know whether the proposal to change the cadence has been accepted (spoiler: the SC is definitely sympathetic to the idea, but there are non-trivial challenges in the concrete details), it probably makes sense to use PEP 596 to propose changing the release cadence (using Python 3.9 as its example release), and defer creation of the actual 3.9 schedule until the change proposal is resolved.