Python LTS and maintenance cycles

How easy is it for people to find the Python versions in use?

For Android and browsers there are many popular dashboards we can use

For some maintainers, it may just be too much work to dig this data out, so it’s more of an “until I start using a new feature” or “two years is enough” approach

Perhaps a simple “Python versions in use” page on PyPI is enough

So we aren’t forcing people to support ten-year-old versions, but we can help steer people towards agreeing on using the same signals for support

Like web developers have already done with browser adoption, and iOS and Android developers with OS versions, etc.

So far it looks like a few things would be desirable, rather than an LTS:

  • Make it easier for developers to make informed trade-offs in line with Python adoption, reducing pain for end users
  • Make it easier to run code / tools for converting to older versions (transpiling? feature detection?), which reduces the burden on maintainers and the pain for end users
  • Have pip automatically set the minimum Python version a package can be downloaded for, based on detected feature use
1 Like

I have to be blunt here. Repeatedly calling this decision making “arbitrary” is very grating as a library maintainer.

Let me pick one thing out of 3.11, one which I consider to be on the easy end of the difficulty spectrum, to highlight. tomllib.
I have CI matrix jobs which test on 3.11+, 3.10 (and below) with tomli, and 3.10 without tomli. I have codepaths which detect which of those three scenarios I’m in, in multiple projects. Some programs fail to run without toml support and some emit errors. Some warn but continue to function. All of this needs to be tested, as I said, which not only means CI run time cost, but also infrastructural cost to configure those scenarios.
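A minimal sketch of that three-way dispatch (the names and error handling here are assumptions, not copied from the projects mentioned):

```python
import sys

# Three scenarios: 3.11+ (stdlib tomllib), older Python with the tomli
# backport installed, and older Python without it.
if sys.version_info >= (3, 11):
    import tomllib
else:
    try:
        import tomli as tomllib
    except ImportError:
        tomllib = None  # degrade gracefully: warn, error, or refuse to run

def load_config(path):
    """Parse a TOML file, failing clearly when no parser is available."""
    if tomllib is None:
        raise RuntimeError(
            "TOML support requires Python 3.11+ or the 'tomli' package"
        )
    with open(path, "rb") as f:
        return tomllib.load(f)
```

Every branch here is another CI job to configure and keep green, which is exactly the marginal cost being described.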

It seems to me that you’re saying that these costs are negligible, and I should carry on doing tomli/tomllib dispatch for another decade. But the thing is… I don’t have to, just the way I don’t have to write from __future__ import with_statement anymore.
Each of these sorts of things is marginal in cost. But when you add them up, it really does make projects harder to maintain.

I don’t think anyone in this thread has been dismissive or unsympathetic, but they have consistently said “No.” Because we don’t seem to agree that this is zero cost or low cost. I don’t really know how to – or if we can – reconcile this difference of opinion, but please don’t call my opinion about good maintenance practice “arbitrary”.

22 Likes

I think there is something we can agree on: both sides are basically agreeing that there is a mismatch in the release cadence

It’s just that they disagree on whether it can be solved

Take 3.10 vs 3.11. It seems they were released a year apart, but one is easier for library maintainers to support than the other

It looks like the OP doesn’t know whether the version he’s using for his project is the recommended one, or whether packages will continue to support it with security fixes. Even though 3.10 and 3.11 reach end of life a year apart, it seems some packages will drop the less convenient version sooner

If, say, when they get around to upgrading their packages, they could see which version is the recommended or most-used one, then at least that’s a signal they can use: “if I use 3.11, which was much better thanks to tomllib, I’ll have a better experience than with 3.10, which may just have had a few small tweaks here and there and so is more likely to be dropped much sooner by maintainers”

From the maintainer’s side, you don’t want to have to maintain what you don’t need to, which is a trade-off. But since the signals aren’t always clear, what you pick will vary, and will therefore come across as arbitrary to those who aren’t aware of the details behind the choice. This is in stark contrast to other environments where the community has agreed around using browser share, language major version, a specific OS year, etc.

A ten-year LTS is probably the wrong tool for aligning these two points of view. Other languages manage it by bumping the major version and coalescing big quality-of-life changes for maintainers, so instead of “should I adopt Python 3.14, 3.14.1, 3.15, or 3.16? Is it worth fixing any breaking changes / recertifying?”, it becomes “OK, it looks like people are transitioning to Python 4”

This doesn’t seem to fit Python’s approach, but perhaps better tooling, or a dashboard of the most popular versions on the website, would smooth things a little. It seems this was already quite difficult to dig out, and it might be something we can all use as a metric for whether we should support / use specific versions

This would give an extremely harmful signal to industry use of python. It’s been hard enough where I work ensuring that we update things in a timely manner. I’ve had to stop coworkers from blaming open source when the problem is they ignored that even our own scanning tools said the fault was in not upgrading.

While the idea of a long lived deployment sounds good on paper, for anything exposed to the internet, the reality is that updating must continue or you are putting security at risk, and that because maintainer time is not infinite, we cannot ask the open source community to support things for as long as for-profit entities do.

The only places I have had time to contribute back to open source have been in taking what I have seen fail in proprietary situations and trying to address, in the languages and tools that I use, the reasons those failures occurred. From my perspective, an LTS Python version would only signal to companies (ones I wish would contribute back more, and that I have had to fight to do the right thing) that they can go back to being a drain on open source rather than coexisting harmoniously with the resources open source maintainers have.

10 Likes

I think this is the crux of the issue. The modus operandi of software development in general has shifted away from this mentality. It used to be that a new version was released when it had sufficient new functionality to justify an upgrade. Or, well, in practice, maybe sometimes the new functionality didn’t justify it, and then you’d skip that one. But nowadays it seems almost everyone has moved to a time-based release scheme where releases are made on a fixed schedule regardless of the functionality gain. And because of the interconnectedness of the software world, with different libraries depending on each other and ultimately on Python (or some version of whatever language they use), once some big players start to move in this direction, it creates an inertia that’s difficult for individual small libraries to oppose.

I share your frustration with this situation. Part of it is that I tend to think of programming languages, libraries, apps, etc. as tools. I don’t buy a new kitchen knife, stew pot, screwdriver, desk chair, or car battery every X number of years on a schedule; I get a new one when the old one wears out, or when I find a new one that seems to offer some legitimate benefit over the old. Why should software be any different? Well, the reason, of course, is that I can use just about any kitchen knife to cut meat for any stew pot, or use any screwdriver to unscrew any screw of appropriate size, or any desk chair to sit at to run any version of Python, and so on — but software is more and more interconnected so it can’t be used in isolation.

This means that keeping things compatible requires time and attention and effort. It means every stroke on the keyboard has to come with a thought: hmmm, I just typed async def or :=. . . when were those introduced again? Which came first? Do I really need to use that, or is there another way that will still work on an older version?

And then on the producer side (that is, those who develop Python or whatever language or package), decisions likewise have to be made about when there is “enough new stuff” to justify a release. And users can become impatient when they know there is a new feature that is ready for release but a full release isn’t happening because it’s waiting on some other feature, etc. It means releases have to be “curated” to a greater extent than with a fixed schedule, where it can basically be “take everything that’s ready by this date and ship it”.

Unfortunately, it is cognitively easier for everyone to instead lump together the panoply of changes in various software components under the heading of time. We can think “it’s 2024, so I should be using the 2024 version of everything”. This doesn’t work 100% since not everything releases in lockstep, but it generally works well enough, and it relieves people of the burden of having to individually consider the versions of every dependency of their code. Also, the ubiquity of fast internet means that upgrading is “easy” in terms of the base operation; it still may take time to update your code and work out some kinks, but obtaining the new versions is trivial. (This isn’t always true, for instance, with embedded systems, and we do sometimes see comments here from people who work on such things and are stuck having to work with an old version.)

There is another aspect too, which is that “dropping support for version X” (of Python or some other dependency) is often something that a package maintainer doesn’t want to slip into a minor release, especially if “we support old versions” is a stated goal. But sometimes the shifting landscape of dependencies can result in an unexpected need to drop support, meaning that the downstream dependent needs to release a new version that doesn’t offer much new functionality, but just drops support for an old version of something else for maintenance reasons. This kind of undercuts the goal of only releasing when it’s “worth it”. Basically it’s hard to unilaterally decide to support old versions of stuff; you may find yourself on a more rapid upgrade path than you hoped, simply because your dependencies are leaving you behind.

This is one reason I call the situation “unfortunate”. I totally agree. And, unfortunately, that is the direction things seem to be moving. I don’t think Python is a big contributor to that, and in fact I think Python is a notable exception to that trend. But still things do move towards a situation where individual users and even small-time developers face an our-way-or-the-highway choice between going along with what “major players” do or shouldering an increasing burden of forging their own path. But, on the other hand, Python developers themselves have to shoulder that burden if they try to resist changes in external ecosystems (e.g., changes to security mechanisms), which is one reason you’re getting pushback to the tune of “we’re all volunteers”.

I find this sentiment somewhat disheartening. I mean, don’t misunderstand, it’s totally legitimate and valid, but it does seem to lean towards the view that @avylove seems to be lamenting, namely that software development serves the preferences of its makers rather than its users. Sure, there are lots of new features in various versions of Python, but that doesn’t mean everyone needs to use them. It’s only (or largely) because we’ve become accustomed to the constant upgrade cycle that we view it as “normal” to upgrade at every opportunity — and thus become reliant on the new features — rather than viewing it as normal to keep using an older version if it still works. It’s a “more flour more water” kind of situation where getting in the habit of upgrading gets us into the habit of feeling deprived if we’re not using the newest features immediately.[1]

If, as @avylove suggested above, people kept developing with older versions as a matter of course, and only upgraded when they actually felt a specific need (like “this would be so much easier with dataclasses”), I don’t think they’d immediately “miss” the new features.

I’m not the one who said that, but if I understand @avylove right, perhaps an alternative term would be “automatically”. You’re describing a situation where you use tomllib and have a reason to drop support for versions that don’t have it, and I agree that’s not arbitrary, nor is it automatic. But the tenor of some other comments here suggests more that new versions should be adopted, like Mount Everest, “because they’re there”, without a specific basis grounded in the specific features provided by any particular version. No doubt in practice every version has something that somebody finds vital, but it’s still conceivable that each package could have a different upgrade pace and thus the average lowest-supported-version could be lower than it is now.

From a practical perspective, it sounds like some of what you want could be achieved without something as drastic as an LTS Python with 12-year support. Maybe it could be just a matter of keeping an eye on commits to the tools you rely on most, and volunteering to provide alternative implementations that don’t rely on newer Python features. Sometimes tool maintainers are resistant even to this, but I think in many cases there’s an important difference between “please change your code so I don’t have to upgrade” and “here’s an alternative PR that maintains functionality but also supports an older Python version”.

But they could still reject it. That’s the thing. I differ slightly with some of the conventional wisdom in that I don’t think it’s automatically wrong to ask people to support old versions for free. After all, they’re doing everything for free, what’s the difference? :slight_smile: But the thing is, even if we ask. . . they can still say no. :slight_smile: And it sounds like that’s what they’re saying.

Either way, I as a user always treasure developers who make an extra effort to maintain support for older versions of stuff, so I appreciate your posts on this matter.


  1. Personally, I don’t find many new features in recent Python versions super compelling. The walrus operator is a classic example of something that doesn’t really open up any new possibilities; it’s just a shortcut for things that could already be done quite easily. But that’s neither here nor there. ↩︎

4 Likes

Keep in mind that this is the response you get when your use-case is “I need government audited software shipped over a low-throughput connection to my spacecraft”.

For the wider variety of software systems out there, the compliance regimes they are subject to are much more lax. (Sometimes disturbingly so, but I digress…) The cost of upgrading is not so incredibly high for most of us.

“I need to be updating servers to a recent Python release every 3-5 years” is a known factor your organization should be considering when deciding to build mission-critical software in Python. And if that’s not acceptable, then you might want to consider using a different language – e.g., Java.

Yes, and please let’s include CI platforms and developer tools in our broad set of dependencies.

I maintained Python 2.7 support in one project until I reached a breaking point not in terms of ecosystem dependencies, but in terms of a degraded development experience. I was spending more time keeping the creaking infrastructure working than I was on the actual package contents.

FWIW, I think the OP is holding the right end of the stick in that if you want a longer support window in the ecosystem-wide tooling, you need to start from supported CPython versions. As far as I’m concerned, giving me a PR which makes my python3.8+ code nominally compatible with 3.3 requires that I choose one of the two possible paths:

  • accept the PR, start testing on 3.3, declare and maintain support for 3.3
  • reject the PR

Because I’m not about to declare that code works on 3.3 without testing safeguards to ensure that it continues to work on that version.

And, to be honest, I wouldn’t mind that sort of support burden in a large number of projects, if it weren’t for the fact that the whole ecosystem has left 3.3 way behind. (Also, I am using type checking, so…)

Indeed, I am doing the work for free – because it’s fun, because it’s a good resume item, because it exposes me to users whose use cases inspire me, etc. – and I don’t really mind doing a marginally harder version of the work to give a user a better experience. But it’s not a marginal-cost kind of thing. It’s a lot harder to support 3.3 or 3.4 than it is to support 3.9.

9 Likes

But on the other end, I maintain software that regularly bumps the Python version every 3 to 5 years

And one year I had some quite major libraries upgrade to 3.10 out of the blue, when 3.9 hadn’t been out all that long, for no reason other than to adopt some new syntax

Unfortunately this upgrade wasn’t possible, since there were environmental problems outside of my control preventing me from upgrading from 3.9 right away.

But this also meant having to ignore security patches, or fight to get 3.10 packages to work on 3.9. 3.11 hasn’t been as bad in comparison, but there is certainly room to improve this situation

1 Like

It was suggested a few times above that it is not difficult to support older Python versions so I just want to note here that setuptools was closely coupled to the stdlib via distutils. I can imagine that the setuptools developers would not have taken the decision to drop support for Python 3.6 lightly but also that maintaining wide support for different Python versions was a real burden for them at the time that they made that decision. Some progress has been made on this now with distutils having been moved from stdlib to become an internal part of setuptools.

Also it was mentioned above that it would be better if some Python tooling like pip was not designed to run within the Python environment that it manages. There has been enough progress on standardising packaging in Python now that we do have a tool (uv) that is not implemented in Python at all and that can implement all of the various standards and can be used in place of much of the mentioned tooling. I don’t know whether uv supports Python versions as old as 3.6 though…

1 Like

Only 3.8+ is supported by uv: uv/pyproject.toml at 6ff674f5bfec40c4a8b14f55cfa7a2b55fa0e812 · astral-sh/uv · GitHub

I mean if I had some code that could be improved by using structural pattern matching I would just release a new version requiring it. The old version still exists and likely will as long as PyPI exists.

setuptools should be workable. It gained support for PEP 517 in v36.6 and dropped support for Python 3.6 in v51, so some version in that range should do well. The suggestion to try flit is also a good one. (You could also just not use new features, for the same reasons that you are trying to explain to people that they shouldn’t use new features)

I’ve had reasonable luck using per-environment lock files generated via pip-compile in tox and using install_command to avoid duplicating env’s.
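A hypothetical sketch of that setup (file names, env names, and the pip-compile invocation are my assumptions, not from the post): each tox environment installs from its own lock file, and install_command is overridden so the lock file stays authoritative.

```ini
# tox.ini (sketch, assumed layout)
[tox]
envlist = py39, py312

[testenv]
# one lock file per environment, e.g. generated with:
#   pip-compile --output-file requirements/py39.txt pyproject.toml
deps = -r requirements/{envname}.txt
# --no-deps keeps pip from resolving anything outside the lock file
install_command = python -m pip install --no-deps {opts} {packages}
commands = pytest {posargs}
```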

The python-requires in uv isn’t super meaningful, since uv has first class cross-environment support. uv can target 3.7 environments and it would probably be pretty easy to add 3.6 support.

I’d like to turn this around. What is the value in the current maintenance cycle?

Again, I think the more productive use of this thread is solving the specific tooling issues, since those seem very tractable to me.

If you’d like to continue pursuing the idea of Python LTS, despite the negative consensus and close to zero opinions being changed in this thread, you should 1) find 1 core dev willing to sponsor a PEP, 2) write a PEP, 3) see what happens

7 Likes

More often than not, writing with the new syntax is a major improvement. Python doesn’t get much in the way of new syntax frequently, and keywords even less than that. If I upgrade for syntax, there’s usually a strong reason.

An example of this I ran into myself: by upgrading to 3.12 for the type statement, and pairing that with defer-imports (GitHub - Sachaa-Thanasius/defer-imports: lazy imports with regular syntax in pure Python), I could significantly restructure a library that has some interconnected types without creating an unresolvable circular import, without resorting to tag types that both modules import, and without messing with type-checking hacks. The library being more naturally structured makes maintaining it easier.

That is the version constraint if installing uv into a Python environment. The point I was making is that uv can be installed before having any Python installation and can manage Python environments from the outside. The question about whether uv supports Python 3.6 is more like:

  1. Can uv install packages into a 3.6 venv?
  2. Can uv download and install a CPython 3.6 PyBI to make a Python 3.6 environment?
1 Like

Yeah, I suppose the issue I had is that the maintainer or runtime didn’t do a great job of communicating the requirements, or when old versions would be dropped

Pip allowed me to install a version that would crash, since the maintainer forgot to bump the package details

And since there isn’t a consistent convention on Python version upgrades across packages, I basically had to willingly choose to forgo security fixes, at least for a little while

These are likely not fully solvable, but making it easier for maintainers to align on the information they use when deciding to upgrade their packages would go a long way.

Perhaps, for example, if there were a page on PyPI showing which Python versions are most popular, then it would be easier to see “okay, we are way behind, we need to upgrade”

Right now it feels like a Python version can be dropped by a package maintainer when it’s only two years old. There are many reasons why some of us can’t upgrade until five years later

And perhaps, if that’s not feasible and support needs to be dropped early, the page would serve as a gentle nudge to remember those who can’t upgrade yet, and to make sure you pin your minimum version (it would be great if pip could scan for use of new features and auto-pin the version)

1 Like

hmm, I don’t think pip should be responsible for this, and good CI should catch this as a problem already…

it’s not really possible to address this by just scanning the file; you have to actually try to import it, because custom module loaders can change what is valid syntax, and there are conditional imports.

I’m a PyPA member, a maintainer of pip, and the PEP-delegate for interoperability PEPs in the packaging area. As such, I think I can speak with some authority here - although the only truly authoritative statement I can make is that there is no group that can or will make statements on behalf of all (or even a significant subset of) packaging tools. The PyPA as it stands as a matter of policy has no authority to override individual projects’ decisions.

With that said, you’ve already had my opinions. Supporting non-EOL versions of Python under the current CPython support policy is a fair and reasonable level of commitment, but going beyond that is not.

Supporting 10 years’ worth of Python versions, whether by requiring all code to avoid Python features newer than 10 years old, or by maintaining multiple supported branches targeting different Python versions, is an unreasonable demand to place on tools - particularly as all the evidence shows that the significant majority of users are comfortable with current support levels.

I don’t know what you’re after in terms of “improving cooperation”. You’ve not, as far as I can see, offered anything yourself - all you’ve done is suggest that we increase our workload to address an issue that you have (and which you claim is not uncommon, although you seem unwilling to engage with our contrary experience to the effect that it actually is uncommon). That’s not what I would call “cooperation”. People have offered you help in finding ways of working with current tool support policies, but you’ve rejected that help. You say you’re not demanding anything, but it’s hard to see what you are doing in that case.

That’s roughly what pip’s support policy is based on. Our evidence is that supporting just non-EOL versions of Python is acceptable, based on usage numbers in the download statistics. We don’t drop support for a version immediately it goes EOL, but we’ve never had evidence of a need to keep support for a significant length of time. So we usually drop support as soon as we start using language features not available in the now-EOL version, and that’s typically enough time for usage figures to drop below our threshold.

I completely agree. It’s very hard to take your statements as being in good faith, when you consistently dismiss as “arbitrary” what are actually carefully discussed and evaluated support decisions - at least that’s the case for pip, and I have no reason to believe that any project maintainer treats dropping support for a Python version “arbitrarily”.

Just because you don’t like the decisions projects make, is no reason to insult the people making those decisions.

That should be “open source software development”.

Of course open source development serves the preferences of its makers. It’s a hobby. If you want me to do something that I don’t want to do, then pay me. Otherwise be glad that I get some level of enjoyment[1] from sharing things that I wrote that I think are cool.

There are so many articles on the web about this sort of topic that it’s hard to know where to begin. This is one that I like: Setting expectations for open source participation. But ultimately, the key fact is that even the most well-meaning “I think you should do X” misses the point that I don’t have to do anything that I don’t want to do - and pressuring me simply makes it more likely that I’ll stop even doing what I currently do.


  1. The amount of enjoyment rapidly goes down during conversations like this when people seem to think I owe them something :slightly_frowning_face: ↩︎

20 Likes

To expand on @pf_moore’s response, software development serves the preferences of who pays for it. If it’s made by volunteers, then the makers are paying for it with their time and effort.

That seems acceptable, but it’s way too hard to dig this information out. On the current homepage of the website there is no easy-to-find link even for the EOL schedule, only the latest version; there isn’t even a link to the release history

So it sounds like making it easier to dig out this info for casual users would go a long way

Contrast that with Node.js: not only does it point you to the recommended version, with a link to a version you can use for new features

But in the main menu it has a “supported versions” link

This may be the main reason behind the varying degrees of support by package maintainers which may feel arbitrary to package consumers

Perhaps we should be encouraging as much of the ecosystem as we can to support all non-EOL Python versions

But the current website doesn’t communicate that effectively, IMHO

Screenshots as samples




If we even look at Django’s download page, it would be an improvement over Python’s

Interestingly, Django only does 1-year cycles, with a 3-year LTS release carrying the shiny new breaking features / deeper refactors

Perhaps Python could do the same

2 Likes

The status of Python versions is easily accessible from the Dev Guide homepage:

Also, note that Python is not intended solely for developers. Its ease of use makes it accessible to the non-developer community as well.


I don’t have a good experience with other programming languages that have LTS. It’s similar to the experience with IE6[1], where you were forced to support it, but no one wanted to.


  1. IE6 support on all Windows versions ended, more than 14 years after its original release ↩︎

4 Likes

First time I even knew this existed, which kind of demonstrates the problem :grinning: I don’t think “the information non-Python-language developers may want to find is easily accessible on a page designed and signposted for people who are Python language developers” is an adequate justification for the current setup. We should at the very least have a link on the main website to this page (the dev guide, by the way, is only linked right at the bottom of the website, in the footer, in my browser, so I wouldn’t say that’s “easy”)

In other languages it’s not on a separate page.

Agree, though, that we don’t want an IE6 situation; I just think the current way of surfacing which versions the Python core developers think are important to support (i.e. not EOL), or are a good idea to use, isn’t ideal

5 Likes