Python LTS and maintenance cycles

TL;DR:
Support for most releases should be shorter (18 - 36 months) with periodic LTS releases supported for 10 to 12 years. This will allow package and tooling maintainers to focus support on a few specific releases. It would also allow users who are unable to move to new versions frequently to use current packaging and tooling.

The pace of development in CPython and the Python ecosystem over the last few years has been impressive. It’s great to see the language continually evolving. However, this does not come without issues, especially in spaces where keeping up with the recent versions is challenging or not possible.

I want to start with a little context. There’s an area of computer systems engineering focused on how to deploy, maintain, and operate operating systems at scale, and some of its basic principles make it difficult to move quickly to new versions of software. The biggest one here is package management: there is a single package management system, and it operates at the system level. So, in the case of Python on Linux, you’ll be installing all your Python packages with a package manager like DNF.

The reason for this is auditing. You need to be able to say where every file on the system came from in order to identify rogue software as well as software with known bugs and vulnerabilities. (In practice, not every part of the filesystem is restricted to known files. Temporary files, user files, and data usually aren’t restricted this way, but they are confined to non-executable parts of the filesystem and isolated from unrelated processes.) Efforts have been made to move this monitoring into the execution path; fapolicyd with an RPM plugin is an example of that.

Perhaps this sounds like overkill. For organizations that handle personal or sensitive data, I’d say it’s fundamental. Keep in mind, the majority of hacks are not particularly sophisticated; they take advantage of security holes that would not have existed if baseline security practices were employed.

There are also many use cases where Python is used in areas that are hard to update due to access or bandwidth limitations. Some examples include satellites, power substations, and remote sensors. In some of these cases software may never be updated after deployment. In other cases, there is a process to provide software updates, but major versions are locked to ensure compatibility and/or because of resource limitations.

So what does this mean in practice? There is going to be some variation, but I can give the example of moving to a new version of Python on Linux in an enterprise environment. In short, it means the Python runtime, any tools used, and any packages used need to be packaged into RPMs and tested. Any software that vendors packages either needs to be unvendored or have automated monitoring configured to track its vendored packages for bugs and vulnerabilities. It’s a lot of work. Luckily, vendors like Red Hat and community projects like EPEL do a lot of that work. However, much of it is done by volunteers and the pace is slow.

Currently Python releases a new version every year and supports it for 5 years, thanks to the efforts of many people, especially the release managers.

To get an idea of version adoption we can look at downloads for the Requests package from PyPI. This is not a direct proxy for use since it doesn’t cover installations through package management systems, PyPI clones, source, vendoring, or images. However, it is likely a good indicator of what is being tested against by CI and used on personal computers. My guess is actual use would skew older as most of the installation mechanisms above will have updating delays.

Requests download statistics for previous 30 days from pypinfo (October 17, 2024)

| python_version | percent | download_count |
| -------------- | ------- | -------------- |
| 3.10 | 19.95% | 105,389,188 |
| 3.11 | 17.59% | 92,887,955 |
| 3.7 | 16.23% | 85,713,962 |
| 3.8 | 13.26% | 70,046,673 |
| 3.9 | 13.00% | 68,659,729 |
| 3.12 | 10.99% | 58,025,183 |
| 3.6 | 7.32% | 38,674,752 |
| 2.7 | 0.88% | 4,654,680 |
| 3.13 | 0.73% | 3,879,841 |
| 3.5 | 0.04% | 199,131 |
| 3.4 | 0.00% | 11,540 |
| 3.14 | 0.00% | 4,209 |
| 3.3 | 0.00% | 130 |
| 3.2 | 0.00% | 3 |
| 2.6 | 0.00% | 3 |
| **Total** | | **528,146,979** |

A few things stand out.
- The most used Python version, 3.10, is 60% through its release cycle
- 37.7% of downloads were for unsupported Python versions
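The second figure can be reproduced from the table above (2.6, 2.7, and 3.2 through 3.8 were all past EOL when these numbers were pulled, 3.8 having reached EOL earlier that same month):

```python
# Download counts copied from the pypinfo table above.
downloads = {
    "2.6": 3, "2.7": 4_654_680, "3.2": 3, "3.3": 130,
    "3.4": 11_540, "3.5": 199_131, "3.6": 38_674_752,
    "3.7": 85_713_962, "3.8": 70_046_673, "3.9": 68_659_729,
    "3.10": 105_389_188, "3.11": 92_887_955, "3.12": 58_025_183,
    "3.13": 3_879_841, "3.14": 4_209,
}
# Versions past end-of-life as of October 2024.
eol = {"2.6", "2.7", "3.2", "3.3", "3.4", "3.5", "3.6", "3.7", "3.8"}

total = sum(downloads.values())
unsupported = sum(n for v, n in downloads.items() if v in eol)
print(f"{100 * unsupported / total:.1f}%")  # → 37.7%
```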

What problems does this introduce? Luckily, because Python rarely introduces breaking changes, compatibility is not at the top of the list. However, two issues stand out. The first is security. While 3.8 received a security-related update in March 2024, 3.7, which is still heavily used, did not receive an equivalent update. Luckily, not many vulnerabilities are identified in the Python runtime and standard library, and even when they are found, they are rarely major.

The second problem is an ecosystem problem. I maintain several Python packages. I usually have two criteria for dropping a Python version from support and CI: the Python version must not be included in a currently supported version of RHEL (a proxy for major distribution support), and the number of downloads from PyPI for that version of Python must be both a low percentage and a low absolute number. Because I’m supporting Python versions that are older than the officially supported versions, I often find ecosystem tools like tox, virtualenv, and pip fail with older versions and need to be pinned in CI. Sometimes these can be pinned for a single Python version; sometimes they have to be pinned universally. Packaging is another issue, as newer packaging methods do not work with older versions of Python. This keeps many projects from transitioning to the new methods.
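As a sketch of what that pinning can look like (the specific version ceilings are from memory and should be verified against each tool’s changelog: virtualenv 20.22 dropped the ability to create Python <= 3.6 environments, and pip 21.3.1 was the last release supporting 3.6):

```ini
# tox.ini -- hypothetical pins for keeping an EOL interpreter in CI
[tox]
envlist = py36, py311
# universal pin: newer virtualenv can no longer create 3.6 environments
requires =
    virtualenv < 20.22

[testenv:py36]
# per-version pin: downgrade pip inside the 3.6 environment only,
# since pip 22+ dropped Python 3.6
deps =
    pip==21.3.1
```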

So what can be done? I think the answer is twofold: tooling and maintenance cycles. On the tooling side, tools can be improved to the point where they can operate against versions of Python they aren’t running on. This would allow CI environments to use the newest tooling against older Python versions. This was discussed for pip in 2018 but not implemented; uv later implemented similar functionality in Rust. But much of that tooling is outside the scope of what the PSF governs, so let’s look at maintenance cycles.
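To make the idea concrete: a tool’s knowledge of “what works on 3.8” can be plain data that it evaluates for any target version, independent of the interpreter the tool itself runs on. A toy sketch (the package metadata is invented, and the `>=X.Y`-only specifier handling is a deliberate simplification):

```python
# Pick the newest release of a package whose requires-python range
# admits a target interpreter the tool itself is NOT running on.
def supports(target: tuple, requires: str) -> bool:
    """Tiny subset of requires-python handling: '>=X.Y' specifiers only."""
    assert requires.startswith(">=")
    major, minor = requires[2:].split(".")
    return target >= (int(major), int(minor))

# Hypothetical per-release metadata, as a package index could report it.
releases = {
    "1.0": ">=3.4",
    "2.0": ">=3.7",
    "3.0": ">=3.9",
}

def best_release(target: tuple) -> str:
    ok = [r for r, req in releases.items() if supports(target, req)]
    return max(ok, key=lambda r: tuple(map(int, r.split("."))))

# A tool running on 3.13 can still resolve for a 3.8 target.
print(best_release((3, 8)))  # → 2.0
```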

Currently, Python versions are released annually and maintained for 5 years, initially with bug fixes, moving to security fixes as the version matures. I think supporting some releases for 10 - 12 years would be a better match for how things are used in the real world. To reduce the support burden, non-LTS releases should have a shorter support window of 18 - 36 months. This would allow usage to coalesce around a small number of versions without slowing down the pace of development. Package maintainers could then support users unable to move to newer versions without having to support many versions.
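A rough model shows this needn’t increase the number of simultaneously supported versions (the parameters here, annual interim releases with 24-month support plus an LTS every 4 years with 10-year support, are one illustrative pick from the ranges proposed above):

```python
def active(t, interval, support):
    """Releases live at year t: one release every `interval` years,
    each supported for `support` years."""
    return sum(1 for r in range(0, t + 1, interval) if t < r + support)

# Current model: a release every year, each supported for 5 years.
current = {active(t, 1, 5) for t in range(20, 40)}

# Sketched proposal: annual interim releases with 24-month support,
# plus an LTS every 4 years with 10-year support.
proposed = {active(t, 1, 2) + active(t, 4, 10) for t in range(20, 40)}

print(sorted(current), sorted(proposed))  # → [5] [4, 5]
```

Under these assumptions the steady-state count is 4 - 5 concurrent versions, no worse than today’s 5, while long-horizon users coalesce onto the 2 - 3 live LTS lines.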

There are two models for determining which releases to tag as LTS: prescribed LTS and ad hoc LTS.

With prescribed LTS, an LTS release is done every x years, for example every 4 years and maintained for 12 years, or every 5 years and maintained for 10 years. The advantage is that LTS releases can be predicted well in advance. The disadvantage is that LTS releases may not line up with major OS releases. This is how the Ubuntu release model works: releases every 6 months, supported for 9 months, with LTS versions released every 2 years and supported for 5 years (10 with extended support).

With ad hoc LTS, a release is selected as LTS based on its features and inclusion in LTS OS releases. This allows flexibility at the expense of some predictability. This is how the Linux kernel is supported, with versions selected as LTS based on several factors and maintained for varying lengths of time. The Linux kernel also has SLTS (Super Long Term Support) releases intended for industrial and civil infrastructure use cases.

I’m not sure which model would work best for Python, and it may need some analysis. It may be useful to let downstream OS vendors weigh in, and to use this as an opportunity to solicit financial support and/or resources from those vendors for the effort.

3 Likes

If we were to change the support model to include LTS releases with longer than usual lifetimes, this is probably the most important “human” aspect to address. I’m not sure how much folks realize the life commitment it takes to step up to be an RM. The closest anyone has gotten in practice to committing to be an LTS RM is @benjamin for 2.7 and I’m willing to bet even he didn’t realize what a long term commitment it would become[1]. Of course, Benjamin did a stellar job as 2.7 RM but I can only imagine the relief when it finally went EOL. Then again, with RMs today taking on two consecutive releases, it can probably feel pretty close to that.

Not that this can’t be solved, but what I think is different here than, say, a Linux distro LTS is that we’re all volunteers. Even folks who get paid to work on Python are still in a sense volunteers, given the commitment it takes over many years. Maybe through the PSF and the DiR program, we can ensure continuity and actually be able to live up to any type of LTS release.


  1. Nobody expected 2.7 to effectively be an LTS ↩︎

3 Likes

Unless you’re really heavily coupled with Python’s internals, I don’t see any meaningful difference between a package supporting one 10-year-old version and the last 10 versions of Python. Both would (as of now) block everyone from using f-strings and typing, and most likely force them back to __file__-based resource loading. Both introduce a 10-year wait between a feature being introduced and said feature being usable. You could cut out some CI jobs, but you can almost always approximate that already by just testing the oldest and latest supported Python versions and trusting the stuff in between to behave like one or the other.
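The endpoints-only approach can be sketched as a CI matrix (GitHub Actions syntax; the version floor and ceiling here are just examples):

```yaml
jobs:
  tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # oldest and newest supported only; trust the versions
        # in between to behave like one or the other
        python-version: ["3.9", "3.13"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: python -m pip install tox && tox
```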

4 Likes

10-12 year support should not be mistaken for anything short of a radical change.
Python 3.4 was released just over 10 years ago. If it had had 10-year support and everybody maintaining mainstream packages supported it, then basically no packages would be able to use non-comment type hints yet.

Which is fine. We can think about radical changes… But as a result, I don’t think that changing the CPython lifecycle would actually have the desired result on security and support. If some subset of Python versions had a 10 year lifecycle, I expect that many libraries would choose not to support those versions past their “normal support window” (i.e. 5 years). Suddenly, the security situation looks a lot murkier. If CPython 3.6 were supported today, but cryptography had stopped publishing releases for it, would it be a net-net improvement?


All in all, this line of thinking sounds to me like it’s putting effort into the wrong place – organizations should be investing in being able to keep their software current, not trying to find a way to buy a 10 year lifecycle not only from CPython but also from the packaging ecosystem.


Almost 100% OT, but I’ve never trusted this practice for my projects. If I write

```python
import sys

if sys.version_info < (3, 11):
    call_useful_library_func()
```

then I really ought to test on 3.10, not just on 3.9. And since any library I use can do that sort of version dispatch, I just assume I need to test each release.


I notice that the OP mentioned supporting older versions than the current CPython versions. I’ve faced issues with this as well. I supported Python 2.7 for as long as I could manage for some projects, and then basically catapulted forward to Python 3.6+.

But I again don’t think that the PSF or even PyPA (with its limited decision-making power in the ecosystem) can do very much on this front. If you try to support a very old version, like 2.6, today, you’ll find that you need to very carefully reconstruct what the world looked like circa 2010. PyPI index state is different, pip and setuptools were different, you might have trouble building that CPython version on a modern platform, etc etc.


Tool support for older Python versions can help, but that’s in the hands of tool maintainers.

(pip is a special case amongst tools, but I do not think it needs to support cross-version behaviors. Other tools which might invoke pip need to choose versions.)

9 Likes

It’s likely that the version of Python used is strongly influenced by the latest version that OSes/distributions are supporting.

For example, if you are using RHEL then you will be on an older Python that is supported by Red Hat.
The same is true for Debian and Ubuntu LTS.

For Windows, I expect the choice of Python version comes down to whether the packages you use are available for that version of Python.

Also remember that if you get your Python version with your OS then you will not see any downloads for it.

In fact, are the download statistics you provide effectively only for Windows and macOS installation kits?

2 Likes

No, these are download statistics for requests · PyPI, the 4th most downloaded package per month: PyPI Download Stats.

2 Likes

I agree. As Stephen mentioned, not only does this put a lot of extra work on the (largely voluntary) CPython team and release managers (hi :wave:), it puts a lot of extra work on the (largely voluntary) package maintainers.

As a package maintainer, I would likely stick to the regular 5-year support window and ignore LTS.

There are also a number of important scientific packages that follow SPEC 0, which recommends an even shorter support window. For example, upstream Python has support for 3.9+, whereas SPEC 0 recommends 3.10+ (and 3.11+ during this quarter).

There are vendors like Red Hat who you can pay to extend support of CPython and packages. You can even pay Canonical and ActiveState if you’re still using Python 2.7! But I don’t recommend volunteers do so.

By all means, if you want to support EOL versions, because vendors still ship them, go ahead. But I don’t think it’s reasonable to expect other volunteers to support something for 12 years. If vendors want to ship and support EOL Pythons, they can also support packages for EOL Pythons.

I think having a 10-12 year LTS would also add extra confusion for users: does this package support LTS or only regular releases? This would put extra pressure on volunteer maintainers, many of whom are solo maintainers.

A note about these Requests numbers. A big chunk of the 3.7 numbers come from Amazon Linux for some reason. This may or may not be important when considering (volunteer) support.

| system_name | distro_name | python_version | percent | download_count |
| ----------- | ----------- | -------------- | ------- | -------------- |
| Linux | Amazon Linux | 3.7 | 14.03% | 63,017,481 |
| Linux | Ubuntu | 3.10 | 11.13% | 49,985,469 |
| Linux | Debian GNU/Linux | 3.11 | 9.60% | 43,102,055 |
| Linux | Ubuntu | 3.8 | 6.49% | 29,162,321 |
| Linux | Ubuntu | 3.11 | 5.90% | 26,484,782 |
| Linux | Ubuntu | 3.9 | 5.00% | 22,441,261 |
| Linux | Ubuntu | 3.12 | 4.90% | 22,001,013 |
| Linux | Debian GNU/Linux | 3.10 | 4.88% | 21,925,314 |
| Linux | Amazon Linux | 3.8 | 4.78% | 21,476,806 |
| Linux | Debian GNU/Linux | 3.12 | 4.76% | 21,378,037 |
| Linux | Amazon Linux | 3.10 | 4.23% | 19,005,439 |
| Linux | Ubuntu | 3.6 | 4.18% | 18,752,293 |
| Linux | Debian GNU/Linux | 3.9 | 4.15% | 18,661,042 |
| Linux | CentOS Linux | 3.6 | 2.97% | 13,335,076 |
| Linux | Amazon Linux | 3.9 | 2.85% | 12,812,280 |
| Linux | Ubuntu | 3.7 | 2.81% | 12,632,288 |
| Linux | Debian GNU/Linux | 3.8 | 2.50% | 11,210,178 |
| Linux | Debian GNU/Linux | 3.7 | 1.86% | 8,334,082 |
| Linux | Red Hat Enterprise Linux | 3.9 | 1.66% | 7,436,039 |
| Linux | Red Hat Enterprise Linux | 3.10 | 1.33% | 5,983,339 |
| **Total** | | | | **449,136,595** |

(`pypinfo --days 30 --limit 20 --percent --markdown requests system distro pyversion`)

And 3.8 has only just gone EOL, and the 3.6-3.8 numbers are all decreasing:

As is the case more generally across all of PyPI:

Finally, PyPI numbers include a lot of CI and mirroring. The results of the most recent Python Developers Survey (carried out in 2023, when 3.8+ were supported) show only 6% of human respondents used EOL 3.7 and below:

(Please fill out the 2024 survey.)

11 Likes

You would only need to get Requests from PyPI when your distro does not package it. It’s packaged for Fedora, RHEL, Ubuntu, and Debian, so you may not see downloads from those users.

2 Likes

I think I misunderstood you: I thought you assumed these were statistics for Python itself.

That was indeed what I assumed. Thanks for making the methodology clear.

3 Likes

Just for curiosity (I don’t own a satellite :wink: ):
How would LTS help here? I’d gather that a minor update (say, 3.4.19) with security patches would require bandwidth of the same order of magnitude as a recent update (e.g., 3.12.7). For compatibility testing, an equivalent, more accessible QA / preproduction environment “on the ground” would be used, I assume.
So what’s the added value of LTS here?

For an OS, it may differ: operation stops while updating the OS. But the satellite could continue to function while upgrading Python (if necessary with the old and new Pythons installed in parallel).

1 Like

The magnitude is very different between a security update and a major version change because, in bandwidth-limited situations, you only send the delta of the changes. Outside of that, there is recertification. Typically, for bug and security fixes you can use an abbreviated certification process where you just have to verify the changes didn’t negatively impact the operational and security profile of a device. If you start changing major version numbers, you’ve introduced functional changes and potential compatibility issues. That requires more extensive integration, testing, and documentation, and is not always practical once a device is deployed.
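As a toy illustration of the bandwidth difference (real update systems use binary delta tools such as bsdiff rather than text diffs, but the principle is the same):

```python
import difflib

# Treat a "release" as text; the security fix touches one line.
old = "\n".join(f"line {i}" for i in range(10_000))
new = old.replace("line 5000", "line 5000 patched")

# The delta is a few context lines; the full payload is the whole file.
delta = "\n".join(
    difflib.unified_diff(old.splitlines(), new.splitlines(), lineterm="")
)
print(f"delta: {len(delta)} bytes, full payload: {len(new)} bytes")
```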

2 Likes

Something else occurs to me related to Python LTSes, in relation to PEP 759 (externally hosted wheels). Having Python releases live for 10+ years means that PyPI packages supporting those LTSes likely need to live on for that long. That’s probably okay for many packages, although even for the small number I maintain, I can say without doubt that I wouldn’t keep Python 3.4 support or branches around to backport fixes and new features. I’m actually quite happy to drop support for EOL’d releases about the time I add (official) support for newly released versions.

But for really large binary packages supported by orgs with the resources to do it, it could still be a real problem. These packages often bump up against PyPI quotas and today, many times old releases have to be deleted in order to stay under quota. You can’t do that for LTS compatible versions, which could mean that such packages would be in a tight spot.

5 Likes

I assume that those bandwidth-limited contexts only rely on CPython, and PyPI packages are written off as impractical to include, unless vendored and bundled with the code which ships to the device.
Is that understanding correct?

As someone with 0 experience working on embedded systems, the whole context is foreign to me, so I’m wanting to learn, even if I don’t know how I can apply that knowledge or what sorts of recommendations I could make.

This feels like the very fact of CPython’s 5-year lifecycle makes it a poor fit for these contexts. Is that the impression you have as someone working in this field? Or is it used extensively in spite of this issue? (Are these people still using Python 2.7? I would assume yes, if you have a system which might have a 10- or 20-year lifecycle and can’t be upgraded until it’s completely decommissioned?)

Likewise! Very often by the time a release reaches EOL, the next release has had enough “bake time” that I can’t wait to move my support floor.
e.g. I didn’t give a hoot about the walrus operator when it was added, but by the time of 3.7 EOL, I realized how much I wanted it.

2 Likes

@barry The number of supported versions wouldn’t change. Right now I have 5 active releases. Even in the case of a 12-year support cycle for LTS releases, you’d typically have 2 active LTS releases (3 every 4th year) and 2 to 3 interim releases, depending on the support window for those.

I do like to cut out support for old versions, but I’ve spent so much time backporting packages that I don’t subject my users to that when there is still demand. Even when I can, it doesn’t buy me that much. Last year I went to move a package so the oldest supported version was 3.6. I thought I could finally move to modern packaging standards. But no, I couldn’t use pyproject.toml with 3.6 because the tooling doesn’t support it. So I ended up leaving the older versions, because dropping them was not a win for me and could impact someone else downstream.

@sirosen I haven’t worked on satellites directly, but I have worked on embedded systems, bandwidth-limited, access-limited, and highly regulated systems. Python is picked for the same reason it’s picked in other environments, because you can develop rapidly and it has a great ecosystem. Disk space is not usually much of a factor these days.

I think what’s missing here is empathy. People use Python in many different ways and no solution is going to work for everyone. But what we have today doesn’t reflect the reality of how people are using Python. Using the 5-year schedule to justify dropping support, whether for CPython or for packages, just pushes the burden downstream. And saying “just upgrade” doesn’t help the person who has no control over that and just has to make it work. And yes, it requires volunteers, but there are volunteers downstream too. It’s best to make positive changes as far upstream as possible to reduce the net amount of work. Then we can collectively do more.

4 Likes

It’s all a trade-off, though. The vast, vast majority of Python users are upgrading well within the current support windows. Saying “just upgrade” doesn’t help someone who must use an EOL version for some reason, but saying “just support it” takes time away from supporting everyone else.

7 Likes

I respectfully disagree. I’m full of empathy towards that developer, sysadmin, etc. who has to work under that constraint.

I have much less empathy for a funded (for-profit or not) organization which wants (perhaps deceptively) large amounts of free community effort to keep the lights on.

And the distinction matters. I put pressure on the organizations to help the people who work there, who need to or have decided to work with Python.
If it’s hard to work on these issues, then orgs may need to increase staffing in order to deal with this problem space. That’s grossly oversimplifying, but it’s the basic form of my opinion.


As mostly an aside, I think there’s a bit of a hidden cost here, which is how hard it is to become a package maintainer. When I onboard new coworkers, we’re looking at supporting a pretty narrow range of versions with pretty similar features.

I don’t need to explain to them that “f-strings were added in 3.6, so we can’t use them for about 18 months more”, or similar minutiae related to the release history. f-strings are here. You can use them. Hooray! The landscape that a new developer sees is simplified in significant ways by having a narrower support window.

16 Likes

I can confidently say that pip will not be in a position to keep its codebase compatible with 12-year old versions of Python. So you’d have to use an older (unsupported) version of pip to manage packages on such an LTS version of Python. Sorry, but that’s the reality - we have a limited all-volunteer maintainer team, and recent Python features are a genuine benefit in maintaining pip on limited resources.

I agree that more empathy would be useful, but the empathy I see lacking is for maintainers who are supporting the Python ecosystem free of charge and in their own time.

Personally, I have a lot of empathy for people stuck in situations (usually organisations) where they cannot use an up to date version of Python. I’ve been in that situation myself. My experience is that it’s very frustrating, but the problem isn’t lack of support from Python, it’s lack of support from your own organisation. And much as I can sympathise with that situation, I can’t fix it.

17 Likes

My current personal assumption is that by the time a release is security-only, let alone EOL, the majority of users should update unless their organization is managing those source-code-only releases. While I don’t frequently break compatibility prior to EOL, this is the window where, if I see a compelling feature, I am open to raising the minimum on existing projects (for new projects I always target the latest or next-to-be-released Python version).

Recently, this was me deciding on a 3.12 minimum, as 3.11 is security-only and I wanted a feature from 3.12 that significantly improves things for my users: the `type` statement.

Users who need 3.11 support can continue using the code as it existed with 3.11 support, but they aren’t the users I’m going to take extra time to support. They have intentionally decided “I need to keep functioning exactly like this”, and they can do that without me; that’s one of the benefits of open source.

2 Likes

Paul, I appreciate all the work that you do, but the Python ecosystem includes package maintainers too, and I’ve seen multiple maintainers give up because they end up spending too much time working around tooling issues, many caused by dropping support for EOL, but not particularly old, versions. If there is a genuine reason for dropping support, that’s something one can relate to, but if it’s just to keep pace with the EOL schedule or because walrus operators are cool, then it feels like the burden is just being pushed downstream.

The reality is the current support cycle is too short, and it doesn’t have anything to do with companies being unwilling to upgrade. Let’s look at a best-case scenario. If you’re running RHEL/Rocky/Alma, which in many cases is the only option given the requirements, you can expect a new release roughly every 3 years, but even that’s not guaranteed.

RHEL 10 is expected in May 2025 and should ship with Python 3.12. While the version of Python RHEL ships with isn’t that important, the version EPEL packages are based on is. They will likely be based on 3.12 and are unlikely to be rebased. They used to be rebased when the RHEL release cycle was longer, but it’s a very difficult thing to do. For a secure system you are installing everything through the system package manager, so if you’re not pulling from EPEL, you are rolling your own RPMs in house.

Typically, you wouldn’t consider RHEL production-ready until a bit after the first minor release, which will likely be November 2025. You let it bake and perhaps start upgrading production systems in January 2026. This is a best-case scenario; a more realistic date would be 6 to 12 months later, maybe even longer if you require external certification of your systems. If CIS/NIST hasn’t published security benchmark specs, you may not even be able to submit for certification.

Then you spend a year upgrading systems. Again, this is a best-case scenario; the process could easily take 2 - 3 years, if not longer. Often it’s delayed by things outside of the organization’s control, like waiting on software vendors to support the new OS release.

Then, if Red Hat keeps to its release cadence, you have RHEL 11 in May 2028, production-ready January 2029, with a best-case migration completed January 2030.

So in January 2030, in an unrealistically optimistic best-case scenario, you have transitioned off Python 3.12, likely to 3.15. 3.12 is scheduled to go EOL in October 2028, 15 months before that. That just doesn’t leave any room.
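The 15-month figure follows directly from the dates above:

```python
from datetime import date

# Best-case RHEL 11 migration completion vs. Python 3.12 end-of-life.
migration_done = date(2030, 1, 1)
py312_eol = date(2028, 10, 1)  # 3.12 EOL: October 2028

# Months the fleet would be running an EOL interpreter.
months = (migration_done.year - py312_eol.year) * 12 \
         + (migration_done.month - py312_eol.month)
print(months)  # → 15
```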

1 Like