Python LTS and maintenance cycles

I agree, but new packaging standards don’t even support 3.6, which isn’t that old.

I hear what you’re saying, but the discouraging part is the tooling and the signal from the PSF that if you don’t have the luxury of running the latest and greatest, we don’t care. So how do I take your advice? Do I just give up supporting Python altogether?

How would it affect you? You’re already making a choice not to support all currently supported versions of Python. You’d still have that right.

pip’s generally one of the better ones, though I have had to pin it at times. Tox could make that better by allowing pip and virtualenv to be pinned per tox environment instead of universally. Where pip and others missed an opportunity is when the assumption was made that the current Python runtime is the target runtime. If I could use modern tooling on a recent Python version but target a different Python version, that would go a long way toward addressing the pain points. The reality is most of the pain is in CI and CI maintenance. As long as the output is something that can be installed on older Python versions, the tools themselves don’t really need to be able to run on older versions.
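
For what it’s worth, there is a partial workaround today that isn’t a tox feature at all: virtualenv reads its options from VIRTUALENV_* environment variables, so the seeded pip can be pinned per invocation. A sketch, with illustrative version pins; whether setenv is consulted early enough for environment creation depends on your tox version, so exporting the variables in the shell before invoking tox is the safer form:

```ini
# tox.ini -- a sketch; the pins are illustrative "last known good for 3.6"
# versions. virtualenv reads VIRTUALENV_<FLAG> from its process environment,
# so `VIRTUALENV_PIP=21.3.1 tox -e py36` works even if setenv here is not
# applied at environment-creation time in your tox version.
[testenv:py36]
setenv =
    VIRTUALENV_PIP = 21.3.1
    VIRTUALENV_SETUPTOOLS = 59.6.0
```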

I’ve been supporting software that worked across a decade or more of Python for a long time. I was glad when 2.6 was gone, because there were some basic things missing. 2.7 wasn’t bad except dealing with Unicode, but even that wasn’t that bad. Now, I try to maintain at least 3.6 for existing projects and intend to for another 4.5 years until RHEL 8 goes EOL. Maintaining 3.6+ is cake except for lack of tooling support. Sometimes I put in a walrus operator by mistake, but then my unit tests complain and I go back and change it.
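
To make that concrete, the deltas involved are usually trivial; a minimal sketch:

```python
import io

stream = io.BytesIO(b"x" * 20000)

def process(chunk):
    print(len(chunk))

# The 3.8+ walrus form that slips in by habit:
#     while (chunk := stream.read(8192)):
#         process(chunk)

# The 3.6-compatible rewrite after the test suite complains:
chunk = stream.read(8192)
while chunk:
    process(chunk)
    chunk = stream.read(8192)
```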

I think the download statistics are a good starting point because that’s the data we have. The reality is they’re a sample, though, and the real numbers are going to skew older. Even if you look at the statistics I cited in the original post, 3.6 is still 7.32% of the downloads for Requests. Smaller than the others, but not insignificant. I think the RHEL support lifecycle is a better proxy, so I use that together with download statistics to determine which versions I support.

2 Likes

It sounds to me that whilst Python has in the past introduced a lot of downstream compatibility libraries for new language features (e.g. the futures package),

it’s lacking the same for the tooling. For example, some of the pyproject.toml features could easily be transformed into a requirements.txt.

So perhaps it’s less about support for old versions and more about how the community at large could add transpilation-like features, as we see in the JavaScript ecosystem.

2 Likes

It seems the consensus in this thread is that there’s relatively little appetite for a Python LTS (and I agree with this consensus).

Maybe this thread could now be more productive if we focussed on specific features in tools or packaging you’d like to see? In the absence of specifics, it kind of sounds like clever use of pip-compile (or uv pip compile) + constraints files could take away a lot of your pain.
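
A sketch of what I mean (file names are placeholders, and how far back uv can target is worth verifying):

```console
# Resolve dependencies for an older target while running a modern tool;
# uv supports --python-version (classic pip-compile resolves only for the
# interpreter it runs under, so this part is uv-specific):
$ uv pip compile requirements.in --python-version 3.8 -o requirements-old.txt

# Hold CI tooling at known-good versions with a constraints file,
# e.g. constraints.txt containing pip==21.3.1 and virtualenv==20.21.1:
$ python -m pip install -c constraints.txt tox
```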

9 Likes

If everyone were still trying to support Python 3.6 in their latest releases, then maybe we wouldn’t have so many improvements. Supporting the old versions isn’t free, and that cost has to be paid from somewhere.

I think this would be more productive. For example what does it mean that “new packaging standards don’t support 3.6”? There are many packaging standards and as far as I know none of them is tied to a particular Python version. The statement presumably refers to some particular tool or some particular standard but it is not specified which.

2 Likes

And existing Python libraries can support EOL Python versions; they have that right and have clearly decided that doing so isn’t in their interest.

I still remember libraries that ended up basically stuck in a non-progressing state due to the Python 2.7/3.x divide (and this was worse than it should have been; see all the libraries that never supported 3.0, 3.1, or 3.2 for genuine technical reasons).

You’re basically re-inviting that for the whole ecosystem, by design, because that’s the kind of feature freeze that supporting a supposed LTS Python version would require. It slows the rate of improvement for everyone interacting with the ecosystem, all to avoid simply pinning your dependencies to the versions that work when you deploy your frozen-in-time system.

4 Likes

While it doesn’t always feel like it, supporting excessively old versions is helping to perpetuate fundamentally broken systems. Why would people be motivated to more actively engage with their own organisations to help get bad software management policies changed when the most obvious consequences of those poor policies are being suffered externally rather than internally?

All of the major commercial operating system vendors have realised that using 10 year old language runtimes is a bad idea (not just due to developer experience concerns, but also from a security risk management perspective), so they actively invest in providing supported ways to get access to newer versions, and so do the PSF and others:

  • while published by the PSF, the Windows Store Python builds are created by a Microsoft engineer (on work time, as far as I know)
  • Red Hat invests heavily in Linux container support and other ways of running newer Python runtimes on older RHEL versions. Other Linux vendors (Canonical, SUSE) are similar.
  • one of the reasons conda became popular is that it explicitly supports user-level installation without system administrator access, not only for Python (the way the python.org installers do), but also for the complex external dependencies of the Scientific Python stack

So my recommendation is to treat requests for support of excessively old versions as an “XY problem”: rather than trying to make using excessively old versions easier (even if that is the specific request we receive), we should be helping the folks that make those requests to identify the primary factors that make the old versions of Python feel like the only option available to them, and trying to solve that underlying problem rather than working around it.

If they are genuinely stuck with ancient versions, then we can teach them how to keep things superficially working (by pinning any dependencies that don’t correctly declare their minimum Python version), but we should also make sure they’re aware that is purely a stopgap measure, and the real solution is to help their organisation work out how to upgrade more frequently.
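
For reference, that stopgap pinning can be as simple as a constraints file; the pins below are illustrative “last releases that still ran on 3.6”, not authoritative:

```console
$ cat constraints-py36.txt
importlib-metadata==4.8.3
zipp==3.6.0
typing-extensions==4.1.1
$ python -m pip install -r requirements.txt -c constraints-py36.txt
```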

15 Likes

I’m not sure how pip-compile would solve anything. But just being able to use modern packaging standards with 3.6+ would go a long way, and not having to universally pin CI tools like virtualenv would make a difference too. Frankly, if tooling maintainers only dropped support for older versions when there was a technical reason, instead of arbitrarily, that would make a big difference.

Can you point to something specific? For most use cases, there isn’t much difference between code for 3.6 and code for 3.13. The main thing would be some of the newer typing syntax, but that only matters if you are adding type annotations and using the latest syntax.
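
For illustration, here’s roughly the kind of delta I mean (only the annotation style really changes):

```python
# 3.6-compatible annotations:
from typing import List, Optional

def find(items: List[str], prefix: Optional[str] = None) -> Optional[str]:
    for item in items:
        if prefix is None or item.startswith(prefix):
            return item
    return None

# The 3.10+ spelling of the same signature would be:
#     def find(items: list[str], prefix: str | None = None) -> str | None: ...
# (`from __future__ import annotations` narrows the gap on 3.7+,
# but that import doesn't exist on 3.6.)
```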

I’ve mentioned multiple times that pyproject.toml doesn’t work with 3.6.

Why would there be a feature freeze? I supported many libraries through the 2/3 transition and rarely felt it held me back. Unicode was a pain, but still pretty easy to deal with. Regardless, supporting older versions is a choice and always will be, and so is how to do it. What I’m asking is that the support model signal reality rather than an impossible aspiration. There is now, and always will be, a bifurcation between projects that can use the latest and greatest and those that can’t. That isn’t a problem, but only supporting some of the community is.

While it’s clear you have good intentions, I think your advice is missing the mark and coming off as condescending. And this is the really frustrating part of this: someone goes looking for help, and what they get is responses that their reality isn’t valid. As I illustrated previously, even in the case of Enterprise Linux (which is far from the most restrictive use case), if Red Hat keeps to a 3-year release cycle, under a best-case scenario Python would need a 6.5-year maintenance cycle to cover it.

I’d like to turn this around: what is the value in the current maintenance cycle? There is definitely a benefit in having annual releases, but why support them all for the same amount of time? Why not support some for longer and some for shorter? It doesn’t have to be 12 years, but 5 years seems arbitrarily short for a language as foundational and pervasive as Python.

2 Likes

Your reality is certainly valid. I can’t speak for every packaging tool, but certainly with pip, and with the standards process, we do consider all the use cases we are aware of, including people stuck on older versions of tools (including Python itself).

What we don’t do is blindly support all of those use cases. There are always trade-offs, and unfortunately, your situation is one that loses out. You have my sympathy over that, but what you don’t have, and what you aren’t entitled to, I’m afraid, is anyone willing to help you deal with the situation for free.

Nobody’s being condescending here. We’re trying, as sympathetically as we can, to make it clear to you that nobody is willing to help you solve your problem for free. There are paid options, which you don’t want to make use of (and that’s a perfectly legitimate choice), but that’s all.

We’ve explained this before. The maintenance cost of changing the maintenance cycle is significant. If you’re not willing to take the word of core developers, tool developers, and release managers (all of whom have chimed in here), I don’t know how to convince you. In addition, there’s the cost of potentially losing developers who like working with the new features of Python and aren’t motivated to give their time for free to codebases that are artificially constrained away from newer developer productivity features (for example, dataclasses, importlib.resources, tomllib).

14 Likes

The key problem to highlight for users affected by this is that old Python versions are only suitable for use cases where someone is putting in the work to ensure that specific Python version remains suitable for that use case.

The LTS Linux vendors do that work, but they do it for their specific use cases, not for running arbitrary Python workloads (this is why the RHEL system Python now lives in /usr/libexec).

Keeping a Python runtime version suitable for arbitrary use cases is a different task, and even our commercial redistributors generally don’t try to push that much beyond the community’s 5 year cut-off.

Running arbitrary user code in the system Python is not the right way to do things, and once you drop that constraint, even RHEL 8 offers newer Python runtimes than 3.6 (checking the App Stream life cycle page, 3.12 has been available since May this year, 3.11 since May last year, and 3.9 is being retired next month).
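
Concretely, that looks something like this on RHEL 8 (a sketch; which streams are available depends on your minor release):

```console
$ sudo dnf install python3.11          # App Stream package, no extra repos
$ python3.11 -m venv /opt/myapp/venv   # isolated from the system Python
$ /opt/myapp/venv/bin/pip install -r requirements.txt
```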

7 Likes

I don’t remember seeing any previous reference to pyproject.toml. From the Discourse search, it looks like this is the first time pyproject.toml has been mentioned in this thread.

The pyproject.toml file has no Python version constraints itself. Presumably you mean that some package added support for pyproject.toml in some release that did not support Python 3.6.

1 Like

This is equivalent to demanding support for every version from the last LTS (10 versions!). A library cannot just pick 5 arbitrary versions to support (2 LTS, 3 new). A library either has to give up on all new features and keep its code at the lowest supported LTS version, or maintain two different code bases. Both of these options are unrealistic. If anything, this would make me support only the non-LTS versions, reducing the total number of Python versions supported from 5 to just 3.

10 Likes

why support them all for the same amount of time? Why not support some for longer and some for shorter?

Perhaps one aspect of stable branch maintenance on a very large and active project that may be non-obvious to those who haven’t had to do it themselves: The primary challenge stems from the vintage of the branches rather than their total count. When you need to backport a fix from newer branches to older ones, the greater the timespan between them the greater their divergence, and the harder backporting becomes. Also older versions need older dependencies and toolchains contemporary with their initial release, which means more work maintaining aging test infrastructure and such.

Supporting 3.6 plus 3.11 through 3.13 would likely be almost as much work as supporting every version between 3.6 and 3.13 inclusive. Dropping support for intermediate branches might reduce the mechanical/process burden slightly, but the overall effort is not eased nearly as much as you might think.

6 Likes

So one aspect I can maybe shed some light on is that, even with interpreted languages, there’s a difference between the runtime and the build-time requirements.

I worked at a startup that used MIPS-based OpenWrt routers as rudimentary IoT devices. Not only did they have this less common CPU architecture, but the ROM had limited storage, requiring more optimised packages to be built (handled by OpenWrt) and tools like Python to be adapted to support them.

This often led to a delay before new Python versions were available: as with Android and mobile phone vendors, they needed to be adapted to this specific context and tested to confirm they functioned as intended, with so many of the libraries they relied on having changed.

Our company didn’t have the internal resources to do this, so we outsourced it to the wider community, but we would have been willing to pay if the process was clear and documented.

But even if we had had this, there would have been a period where we couldn’t use certain packages whilst we waited for this work to be done. These packages could have critical security fixes that weren’t backported for whatever reason.

This is a place where the runtime vs. build-time requirements become stark. Some of the new Python features may not have needed meaningful changes to Python’s bytecode, or could potentially have been emulated by being transpiled to existing constructs in the old language, as is often done in JavaScript (with worse performance).

If there existed a method of doing this, it would open the door for the community to push the language forward while giving us an official way of getting new code to run on old interpreters. No one ever accuses JavaScript of slow innovation, and JavaScript has shown that the tooling that does this can exist outside of the language, allowing the implementation to stay simple.

I now expect the next thing I’ll be asked is: if these companies need it so much, why don’t they build it? And I guess that’s a fair point, but my question would be what the core team would recommend in these cases (transpiling at the AST level or the bytecode level?).

Again, I’m not a great developer; I’ve just had to try and solve the problems my company had with the tools I had to hand. And I’m sure there are many very good reasons why this is not an ideal situation, but unfortunately not all orgs have the knowledge or talent to execute it the ideal way.

Even just having documentation pointing these people to how to solve these problems without creating a burden would, I think, go a long way.

4 Likes

First, like you, I am a volunteer. This is not my problem; it’s a problem I am empathetic to. It’s a problem that affects many people downstream and, really, affects us all. Why? Because the majority of the organizations that work under these constraints operate important components of our society: infrastructure, health care, finance, government, education. Sometimes they have deep pockets, and sometimes they too are relying on volunteers. If our response is “pay someone” or “solve it yourself”, we’ll reach a point where only the Microsofts and Googles of the world run everything.

Secondly, what are these paid options? Yes, Red Hat maintains older versions of the Python runtime. But they don’t package PyPI packages for every version, and they don’t do anything to improve the tooling.

It’s frustrating because I feel like I described the problem set in a lot of detail and provided a possible solution. But rather than getting ideas or clarifying questions, the majority of the responses have been along the lines of “your problem is not valid”. Now you personally have said something more along the lines of “we don’t want to, and here’s why”, which is a valid response even if it’s not encouraging.

Yes, other versions of Python are available in Enterprise Linux, but only a few core packages get repackaged for those versions. Many organizations do not have the resources to repackage all the dependent Python packages. And going back to what I explained in the original post, having everything in system packages is a basic requirement. Even if that is done, depending on the legal requirements, a system may need to be recertified if a runtime changes.

I apologize. I thought I mentioned it, but it seems I just talked about modern packaging generally.

Yes, I don’t remember the specifics now, but I went to move a project that supported 2.7+ to 3.6+, thinking I could use pyproject.toml, which would be a win for me as a maintainer. I couldn’t, so I didn’t end up dropping any versions, because doing so didn’t benefit me and might create a problem for someone else. I mean, I could have gotten rid of a few lines of code, but nothing that would make the project easier to maintain or functionally more efficient.

No one is demanding anything. This is about signaling. If we did adopt an LTS version it would only mean that we would apply security patches to it. Red Hat and others already do this, but it would definitely be better if that was a centralized effort. The main thing here is it would signal that there are people depending on older Python versions. As a package maintainer, you can ignore that signal, or not.

As someone who has spent way too much time doing backports, fewer branches definitely makes a difference. Yes, sometimes being able to step through iterations helps, but often it doesn’t. What I’ve found is that, often, when you backport a bug fix that applies to the target version, the code has changed little since that version, though sometimes it’s been moved or renamed. But this is already happening today. As I mentioned above, Red Hat and others already maintain older Python runtimes. It would be good to coordinate that effort and have better support in the tooling.

I definitely think there are multiple business opportunities where one could help the community and provide a service to businesses. The problem is a lot of the people who have the technical ability actively avoid the business side of things. It would be nice if there was an incubator that paired business types with technical types for Open Source related businesses.

3 Likes

And this signal is problematic. If I drop 3.8 now, I have the backing of the “official” source. If said official source claimed 10 years of support and the library maintainer did not, they are bound to get not-so-nice issues on their projects about how they should be supporting everything from the last 10 years because the official source said so.

8 Likes

I’m assuming that it is setuptools. The original pyproject.toml PEP (PEP 517) was accepted around the same time that CPython 3.6 was released (~2016), but setuptools did not add support for pyproject.toml until some time later. The original purpose of pyproject.toml was to enable alternatives to setuptools such as flit, which existed at the time, and more recently others like hatch, meson-python, etc. I expect that you could use pyproject.toml with an alternative backend like flit (I mention flit specifically because it was around at the time and is mentioned in the PEP, whereas newer backends like hatch may never have supported 3.6).
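
If that’s right, a minimal sketch of what might have worked looks like this; the project name is hypothetical, and the flit_core pin is illustrative (you would need to check which release was the last to run on 3.6):

```toml
[build-system]
requires = ["flit_core >=3.2,<3.9"]  # upper bound illustrative, not verified
build-backend = "flit_core.buildapi"

[project]
name = "mypackage"         # hypothetical
version = "1.0.0"
description = "A project still targeting Python 3.6+"
requires-python = ">=3.6"
```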

2 Likes

I agree with you. If a 10+ year old Python is “officially supported”, many Python users may try to mix a 10+ year old Python, a 10+ year old library, and a 1-day-old library. And they might report issues to maintainers when it doesn’t work.

That is too strong a signal. It will confuse many Python users and lead to maintainer burnout.

P.S. I am not a maintainer of packaging tool nor PyPA member. How about asking packaging tools to support Python versions in EPEL?

7 Likes

There are some maintainers today that don’t support even the current 5 years. As a package maintainer, that’s your choice, but you should document your support policy. If you get an issue like that, you reference your support policy. Really though, there’s no reason to arbitrarily drop support for a Python version just because it’s no longer under maintenance; it costs you nothing if it just works. When that version breaks in your CI, then you decide how much effort, if any, you want to put into it. And if you don’t have that sort of coverage in your CI, then your software is still in a pre-release state anyway.

Yes, it would have been setuptools. That’s usually what I use because it’s always there. I’ve thought about seeing if another build tool would work. I need to look at what build tools are available for building RPMs, because it won’t help me if I can’t also use it there. Outside of that, I looked at Hatch, but it tries to do way too many things, which is also the problem with Poetry (along with a lot of other issues). I have not looked at meson-python or flit. It looks like an older version of flit is available in EPEL, so that may work. Another option I thought about was generating a setup.py from the pyproject.toml prior to building RPMs. It’s not ideal, but workable.
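
That last idea could be as small as a script like this rough sketch, which only handles the basic [project] fields and assumes it runs on a modern Python before the RPM build starts:

```python
# sketch: emit a static setup.py from pyproject.toml so older tooling
# (e.g. an RPM build on a 3.6 system) can keep using setuptools directly
try:
    import tomllib  # Python 3.11+
except ImportError:
    import tomli as tomllib  # `pip install tomli` on older interpreters

TEMPLATE = """\
from setuptools import setup, find_packages

setup(
    name={name!r},
    version={version!r},
    python_requires={requires_python!r},
    install_requires={dependencies!r},
    packages=find_packages(),
)
"""

with open("pyproject.toml", "rb") as f:
    project = tomllib.load(f)["project"]

with open("setup.py", "w") as f:
    f.write(TEMPLATE.format(
        name=project["name"],
        version=project["version"],
        requires_python=project.get("requires-python", ">=3.6"),
        dependencies=project.get("dependencies", []),
    ))
```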

This still doesn’t solve universally pinning dependencies in Tox. I think it’s less of a problem in Nox, but Nox is still missing parallelism, so I only use it for smaller projects. I understand some people have moved to Hatch, but again, it seems to be trying to do too many things.

All of this goes back to the heart of the problem. I spend too much time working around tooling rather than fixing bugs and adding features to the packages I maintain. Part of that points to the tooling renaissance Python has been experiencing, which is a good thing.

Then don’t call it “officially supported”. Call it something else that sends the signal for what it is trying to accomplish. I’m also not sure support would be that big of an issue: if your package metadata is accurate, pip won’t pull your newer releases down onto older interpreters. That’s the way it works today.
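
For a setuptools project, that accuracy comes down to a single line; a minimal sketch in setup.cfg (which works back to 3.6):

```ini
# setup.cfg -- with this declared, pip on a 3.6 interpreter will skip
# releases whose floor has moved to ">=3.7" instead of installing them
# and failing at runtime.
[options]
python_requires = >=3.6
```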

I think there are some PyPA members here; I’d be interested to hear their thoughts on improving the cooperation there. Generally there’s the Python community, the Fedora/EPEL community, and a few people in the middle trying to keep everything working. There is a Python SIG that has made a lot of improvements to Python RPM packaging over the years. A moonshot goal would be for Python package RPMs to be built directly from source rather than needing a spec file, but in the past that’s only been possible with very simple packages, as there’s a lot of variation in tooling. In theory, some additional metadata could be added to pyproject.toml to facilitate this, as sketched below.
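
To be clear that this is speculative: no such table is standardized today, but the kind of metadata I have in mind might look like:

```toml
# Purely hypothetical sketch -- there is no standardized [tool.rpm] table.
[tool.rpm]
package-name = "python3-mypackage"   # hypothetical field
license-files = ["LICENSE"]          # hypothetical field
requires-system = ["openssl-libs"]   # hypothetical field
```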

2 Likes

If Python 3.7 still had support, I wouldn’t have started using the walrus operator on June 6, 2023. Supporting two different language versions is cumbersome. Today, I write Python 3.9-compatible code and have many TODOs for Py 3.10 and Py 3.11 to enhance the code in the future.

Upgrading from Python 3.9 to Python 3.10 is simple, so I don’t need to maintain two different versions.

With LTS, I would have to maintain all LTS versions, because waiting 10 years to use a feature or enhance your code is too long.

If the free-threading implementation is successful, I will drop support for all other Python versions and use Python 3.14 exclusively. Think of it like the Python 2 to 3 transition era. But that wouldn’t be possible with LTS in place.

5 Likes

There is a cost. Every single Python version has brought with it features that make the language better and allow refactors that improve the health of the codebase. The cost of supporting any number of old versions is that your code is limited to the lowest supported version, and this is a heavy price to pay. Not being able to use a decade’s worth of improvements makes libraries a chore to maintain and, at worst, causes the maintainer either to not support LTS at all or to just abandon the library and move on (either scenario ends up worse than the status quo). Dictionaries guaranteeing insertion order and dataclasses were both added in 3.7, and neither could be used by a codebase today if it had to support LTS releases.
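
To make that cost concrete with one of those examples (a toy comparison; a dataclasses backport does exist on PyPI for 3.6, but depending on it is itself added maintenance):

```python
# On 3.7+, one decorator:
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

# A 3.6-bound codebase writes the boilerplate by hand instead:
class Point36:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __repr__(self):
        return "Point36(x={!r}, y={!r})".format(self.x, self.y)

    def __eq__(self, other):
        if not isinstance(other, Point36):
            return NotImplemented
        return (self.x, self.y) == (other.x, other.y)
```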

Not to mention this would introduce several more code paths for various Python versions and much more complex test (and CI) configurations.

Take the recent wheel 2.0 discussions as an example. Having zstd in the minimum supported stdlib is crucial to getting better compression, but currently that’s at least a 5-year wait, and adding LTS to the mix would delay that, and several more crucial improvements to Python and related things, by a decade.

9 Likes