Requires-Python upper limits

Requires-Python was added to allow packages to drop older versions of Python without breaking installation on those older versions. Currently (and for the last 4+ years), pip handles this quite simply: Requires-Python is a free-form SpecifierSet, and pip checks whether the current version of Python is included in the set. If not, it walks back through a package’s history to find the most recent release that passes. This was prompted by IPython dropping Python 2 - 3.3, IIRC, and has accelerated the dropping of older Python versions like 2.7, 3.5, and now 3.6 and even 3.7 in the data science community (NEP 29). This was designed for lower limits and supports them very well.
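
To make that back-search concrete, here is a minimal sketch of the selection logic (my own simplification, not pip’s actual code; pick_version and the release list are hypothetical):

from packaging.specifiers import SpecifierSet

def pick_version(releases, current_python):
    # releases: newest-first list of (version, requires_python) pairs,
    # where requires_python is a specifier string or None (not declared)
    for version, requires_python in releases:
        if requires_python is None or current_python in SpecifierSet(requires_python):
            return version
    raise RuntimeError("no release is compatible with this Python")

# On Python 3.6, a package whose latest release requires >=3.7 back-solves:
releases = [("2.0", ">=3.7"), ("1.9", ">=3.5"), ("1.0", None)]
assert pick_version(releases, "3.6.15") == "1.9"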

However, a growing number of packages are starting to use this for upper limits as well. The official specifications (PEP 345 and the PyPA Core Metadata specification) both state:

This field specifies the Python version(s) that the distribution is guaranteed to be compatible with.

This is extremely problematic; since you cannot “guarantee” compatibility with any future version of Python, every package following the specification to the letter would be required to add an upper cap. Setting aside my opinion that this is a horrible practice, Python itself expects most packages that support the current version without warnings to keep working for at least the next two versions, and upper caps would seriously impact the already slow rollout of support for new Python versions (see the pycln disaster that brought down pybind11, cibuildwheel, and others simply because it had an artificial cap of Python <3.10). On top of all that, upper caps are simply not supported correctly by pip.

Take Numba as an example. Of any package I know, it has the hardest possible cap on the Python version: it uses bytecode details, which are declared internal (and are likely to change quite a bit in the optimization-focused work coming in the next few Python versions!). Numba always has to adapt to new Python versions. So, in version 0.52, they started adding an upper bound to the Requires-Python metadata slot. They also still do the “correct” thing to get a nice error message: they add a check and error to setup.py. So, if you pip install numba on Python 3.10 (a 3.10-compatible 0.55 is about to be released, so you might soon need pip install "numba<0.55" to reproduce this example), then without explanation pip downloads 0.51, then crashes trying to compile a pinned dependency that has no 3.10 wheels (llvmlite). No nice message, and it downloads a seemingly random older version. The extra metadata made this worse; dropping it would have selected numba 0.54, used the more recent, Python 3.10-compatible llvmlite, and produced the more useful error message.
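
For reference, the setup.py “check and error” pattern mentioned above looks roughly like this (a sketch of the general approach, not Numba’s actual code; the bound and the message are illustrative):

# setup.py (sketch)
import sys

if sys.version_info >= (3, 10):
    raise RuntimeError(
        "Python 3.10+ is not supported yet by this release; "
        "please use an older Python or wait for a newer release."
    )

from setuptools import setup

setup()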

Edit: I just realized via testing that PDM (and probably Poetry?) back-solve here too. So using requires-python = ">=3.8" on a new PDM project that has numba as its single dependency will get numba 0.51, since the solver thinks that version supports “all” future Python versions, while 0.54 doesn’t support 3.10+!

Other packages are adopting this too; SciPy is the most recent (and the reason I’m discussing this here). In the full discussion, they are not interested in working with the current implementation in pip, but want the metadata to be “better”, so they are following the PEP’s description and adding an old “last” release without the cap that will break when used from any newer Python. This workaround has issues (why download some old version? If any dependency breaks in the future, this might not be the first point of failure, etc.). Another workaround, based on a manylinux1 idea, would be to upload Python 3.n+1 wheels as soon as they are allowed (ideally after checking, just in case the package actually is compatible with Python 3.n+1!); these wheels would be empty and would have a single dependency on some python-not-supported==3.n+1 SDist package that fails with a message about that version of Python not being supported.
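
A sketch of what such a sentinel SDist could look like (entirely hypothetical - the python-not-supported name comes from the idea above, and the message is illustrative):

# setup.py of a hypothetical python-not-supported sdist: it ships no wheels,
# so any install builds from source, and the build always fails loudly.
import sys

raise SystemExit(
    "Python {}.{} is not supported by the package that "
    "depends on this sentinel.".format(*sys.version_info[:2])
)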

Another major factor (possibly the driving factor already) is that Poetry pushes upper limits on everything, including Python, very hard - it defaults to python = ^3.x, and the authors even suggest capping at the last tested version. While I’d love for this to change, I’m not hopeful.

Hopefully I’ve shown the current situation is a mess. I’d like to propose three possible solutions that would bring the pip implementation and recommendations in sync, and provide a better path for packages.

I should point out that implementing any of these three suggestions requires a new feature in packaging’s SpecifierSet: specset.overlap(spec). This would compute the analytic overlap of a set with a single specifier, returning a falsy value if there is no overlap. The truthy value is unimportant for this use; it could be a plain boolean, or it could be the intersection SpecifierSet. I’ve already wanted this for cibuildwheel; asking requires_python.overlap("3.9.*") would let cibuildwheel know whether it needs to build Python 3.9 wheels. For the ideas below, requires_python.overlap(">{current_python_version}") would give tools like pip or setuptools the information needed to detect upper caps.
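
Until something like .overlap() exists, the upper-cap check can be approximated by probing candidate future versions. A minimal sketch under that assumption (has_upper_cap is my own hypothetical helper; a real .overlap() would be analytic, not a probe):

from packaging.specifiers import SpecifierSet
from packaging.version import Version

def has_upper_cap(requires_python, current="3.10"):
    # Does Requires-Python exclude every (probed) version above `current`?
    spec = SpecifierSet(requires_python)
    probes = [Version("3.{}".format(minor)) for minor in range(11, 20)]
    return not any(v in spec for v in probes if v > Version(current))

assert has_upper_cap(">=3.6,<3.10")  # upper-capped
assert not has_upper_cap(">=3.6")    # open-ended, no cap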

I also should point out that Requires-Python was added to fix broken solves. Using it as an upper cap is trying to break working solves. This is very different, and something that is not supported for other things - there is no Requires-Architecture or similar. If you are on Python 3.10 and trying to use a package that says <3.10, then there is no “different” working version. If you allow the solve anyway, then it will either break anyway if the author was correct (though possibly later at runtime), or work correctly if they were just assuming it might not work. Unlike other limits, solvers can’t downgrade the Python version - that’s in the user’s hands.

Finally, the locking package managers (Poetry and PDM, at least) currently force the Requires-Python metadata to match the lock file. This means that if you depend on scipy, for example, you will be forced to add scipy’s Python cap to your own metadata, even if you don’t cap scipy, because the lock file would otherwise be “invalid” on the newer Python version - even though your library might be absolutely fine upgrading. This is really their problem for tying these two settings together, but it is still something to keep in mind.


Idea 1: Avoid upper capping

This is the smallest code change, as it really is just standardizing the current way that Pip works. This would change the wording of PEP 345 and the spec to mention that Requires-Python is not for limiting upgrades to newer Python versions. Something like:

This field specifies the oldest Python version(s) that the distribution is known to be compatible with.

A warning could be added to setuptools, flit, and twine (and suggested for other tools) if an upper cap is detected, since this is not what Requires-Python is for. We already have a system for indicating whether a package was tested with a specific Python version: Trove classifiers. Enforcing limits will impede Python upgrades. This is also the nicest solution for lock files with the current implementation of locking package managers, since they don’t currently handle upper bounds correctly for libraries.
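
Such a warning could be as simple as the following sketch (my own illustration, not actual setuptools/flit/twine code; the operator check is a crude heuristic for “bounds the Python version from above”):

import warnings
from packaging.specifiers import SpecifierSet

def warn_on_upper_cap(requires_python):
    # <, <=, ==, and ~= all bound the allowed Python versions from above.
    if any(s.operator in ("<", "<=", "==", "~=") for s in SpecifierSet(requires_python)):
        warnings.warn(
            "Requires-Python appears to contain an upper bound; this field "
            "is for dropping old Pythons, not for excluding future ones."
        )

warn_on_upper_cap(">=3.6,<4")  # would warn under this heuristic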

Idea 2: Implement upper capping properly

This is a larger code change, but it is likely the closest to what users currently expect. Pip (and, it is suggested, other solvers) would immediately error out if an upper bound is detected - that is, if all future versions of Python are also excluded by Requires-Python. I believe Poetry and PDM may already do this.
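
In code, the install-time decision might look roughly like this (a sketch of the logic only, not actual pip code; it reuses the hypothetical has_upper_cap helper sketched earlier):

import sys
from packaging.specifiers import SpecifierSet

def check_requires_python(requires_python):
    current = ".".join(map(str, sys.version_info[:3]))
    if current in SpecifierSet(requires_python):
        return "install"  # this release is fine on the running Python
    if has_upper_cap(requires_python, current):
        # Idea 2: error immediately instead of back-searching
        raise SystemExit("This package requires Python {}".format(requires_python))
    return "back-search"  # lower bound not met: look at older releases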

This still requires changing the wording of PEP 345 and the core metadata spec, as adding an upper limit is still a very bad idea for most packages and would still impair the now-yearly Python upgrade process. It also makes it much harder to test alphas/betas/RCs, which I assume some of the packages doing this may not really be doing. The wording should probably be something like this:

This field specifies the Python version(s) that the distribution is known to be compatible with.

This would indicate that an upper cap should only be added if you know the upcoming Python release will break your code. Users will misuse this (as Poetry suggests) and will cap even if they are not sure the next version of Python will break anything. But they are doing that today too, and this would at least produce the right error message.

Idea 3: Ignore upper capping

This is the only solution that doesn’t require updating the text of PEP 345. The code change is nearly identical to the above solution: pip (and, it is suggested, other solvers) would skip the back-search if an upper bound is detected - that is, if all future versions of Python are also excluded by Requires-Python.

The nice thing (depending on your view) is that this makes the Trove classifiers and Requires-Python identical, so it makes it easier to automatically generate them from Requires-Python. Adding an upper bound would have no downside (as mentioned above, upper caps never fix a solve), so you could use this to actually specify what you are guaranteed to be compatible with without breaking what you might be compatible with.

The few packages that really are incompatible with a future version of Python could do exactly what they are trying to do today and add an error to setup.py when a newer version is detected. Different build backends could provide ways to error out - in fact, tools like Poetry could use such a custom setting to drive the locking instead of the Requires-Python field. Unfortunately, if Poetry, PDM, and others do not also ignore the field, this could make them harder to use if more packages start adding upper caps.


Option one formalizes the current Pip behavior. Option two gives people what they think they want, even if it is often destructive. Option three minimizes the chances of things failing when they don’t need to, and has some redundancy with Trove classifiers. Personally, I don’t have a strong preference. What do you think? Is there another option I missed?

Closest discussion I could find was https://discuss.python.org/t/use-of-less-than-next-major-version-e-g-4-in-python-requires-setup-py.


+1 from me for idea 1. I believe that having an upper bound on the Python version is the wrong thing to do and we shouldn’t support it or make it easier for people to do it.

If there’s a genuine situation where adding a cap to the Python version is believed to be the correct solution, someone should present the underlying requirement and we can look at working out a means of supporting that, which doesn’t use Requires-Python with an upper bound.


One extra data point: adding an upper bound breaks solving in both Poetry and PDM in a really bad way. If you add a basic file like this:

[project]
name = ""
version = ""
dependencies = [
  "numba",
]
requires-python = ">=3.8"

[build-system]
requires = ["flit_core"]
build-backend = "flit_core.buildapi"

(Or the matching non-PEP 621 version for Poetry.) (I’m using numba here since SciPy currently supports 3.10.) PDM/Poetry will back-solve to the last version that “supports” all future versions of Python (the last one without an upper limit, 0.51), and will lock that. Which of course breaks, even if you are currently running on Python 3.9! I guess it’s nice that it doesn’t force the user to cap in that case, at least? :wink:

Every single version ever released without a cap would have to be yanked to fix this. Or the caps can be removed, and it then resolves to the latest versions as expected.

PS: I love that PEP 621 allows you to use other PEP 621 backends with PDM! I so wish Poetry would support this!


And this is the key reason why I consider upper bounds on Python versions to be broken. There’s no good transition method to add a cap. A reasonable solution to the underlying issue people are trying to solve with caps (whatever that issue is) would have a proper transition mechanism.


@rgommers ^ FYI, in case you missed it on scipy/scipy#14738

In the original post, Numba already has an existing requirement (although it sounds more like a requirement for an upper bound on the version of CPython)

Edit: Numba’s (resolved) Python 3.9 issue: Python 3.9 Support · Issue #6345 · numba/numba · GitHub

I’m just a common Python developer, but this is an issue I’ve struggled with too and gone back and forth on, for both Python and dependency versions, for my packages and others, so I appreciate the thorough and detailed post and the consideration of options to resolve this. To me, the central issue is that without a way to retroactively declare that past versions of a package are or are not compatible with a certain Python or dependency version, we’re stuck having to guess and either apply an upper bound or not.

Even assuming we do so from the very first released distribution (for new projects), this still means that we block otherwise perfectly viable resolutions if it turns out newer versions of Python (or deps) are compatible; we cannot later go back and update the specifier, short of frequently releasing new versions just to bump it once tested. On the other hand, if we don’t add an upper bound and a version does break, we must also release a new version, and there’s equally no way to mark old versions as incompatible. Unfortunately, as you explain in the linked issue, with the current architecture this is an extremely involved process to fix, so it seems more worthwhile to mitigate it instead, as best we can, via the suggestions here.

We would need to make sure the wording is clear that specifying lower bounds is okay and upper bounds aren’t, to avoid any misinterpretation as prohibiting specifying the minimum for each major version, minor version, etc. Perhaps something involving “known version”, or inverting the statement? We can bikeshed on this later if we choose to go this route; I just wanted to point out that we’d really need clarity in the final statement.

To clarify, would the error occur if any upper bound was used in this field, even if the current Python version is included in the SpecifierSet; only if an upper bound was detected and the current version of Python is not within the SpecifierSet; or, most specifically, if and only if the upper bound itself was not satisfied by the current Python version? Of these three sub-options, the last, I think, would be the most useful and least disruptive to packages that do use upper bounds, but it would possibly require more complex parsing of this field.

Or, really, shouldn’t this say the inverse:

This field specifies the Python version(s) that the distribution is not known to be incompatible with.

Otherwise, it essentially says the same thing as the current version, in different words, as it generally cannot be known that the distribution will be compatible with any and all future Python versions, so we’re stuck with the same issue?


While the Trove classifiers do help mitigate this problem, I don’t see a fundamental difference between newly added caps on the requires-python field and newly added caps in the dependencies listing. Python is just one more dependency to be solved for. In both cases, the resolver will backtrack to the version just before the cap was added. So I am in favor of solution 3: do not back-search at all if version capping is detected in either the dependencies field or the requires-python field.

The few packages that really are incompatible with a future version of Python

I disagree with the sentiment here. Adjusting to a new version of Python requires work. Look how long it is taking for conda-forge to complete the Python 3.10 migration, and how many packages were/are still broken.

In the original post, Numba already has an existing requirement (although it sounds more like a requirement to an upper bound of the version of CPython)

I almost elaborated on that - I believe @pf_moore is not saying that no package has a cap on the Python version; I think he is saying that adding a cap on the Python version doesn’t solve anything - it’s not really a solution to a problem. You can’t “fix” anything by capping Python; you can only break solves. (This is different in conda, where the Python version can be part of the solve and can be rolled back if it fails.) The benefit is a slightly more focused, nicer error message, but you can also get that with an error in setup.py, with the binary upload trick, etc.

Even assuming we do from the very first released distribution

Just to add: if you get it wrong, say you go from ^4.0 (Poetry’s default cap) to <3.10, it’s too late - the old packages are broken and will break solves for anyone whose range includes 3.10 or newer. If you start by capping every single version, that’s “better”, but then it causes all sorts of issues when upgrading (every dependency has to manually update this field and re-release, linearly up the chain, even though Python carefully avoids breaking changes!).

We can bikeshed on this later if we choose to go this route

My examples were by no means final, just showing the basic idea.

and only if the upper bound was not satisfied by the current Python version

What do you mean by “the current version of Python”? I don’t think it makes any sense for the “current” Python to be the one running, so I guess you mean the latest Python available? It would actually be cute if this were time-dependent; that is, upper limits were only honored around the time of the next Python release (that’s date-dependent now, after all) - but due to the issues above, even this would not be sufficient. Even though Numba’s <3.10 is absolutely “correct” as conceptual metadata, it is also wrong - it can break a solve on 3.9 for 3.9 in either locking package manager, because the solver will grab an old version of numba in order to make a lock file that allows Python 3.10+ (at least it thinks it does).

The idea would be that all upper limits are disallowed. You’d have to have at least one >= or > specifier. (For option 1.)

Or, really, shouldn’t this say the inverse

Yes, the wording was not great; it wasn’t meant to be final. I don’t like the double negative there, but it does specify what I was trying to say better. It could be explicit:

This field specifies the earliest Python version(s) that the distribution is known to be compatible with, and must not have an upper bound.

(Again, option 1)

I don’t see a fundamental difference between newly added caps on the requires-python field and newly added caps on the dependencies listing.

There is a key, fundamental difference (ironically, only for PyPI and not for conda): you can’t “solve” for the Python version. For a package, if you have a limit <3, a valid choice is to pick version 2. You can’t do this for Python. In conda, you actually can, though the Python version is so important that much of the time you select an exact (minor) release of Python directly. So the only choice is to scroll back through time and try to get an older version of the package that predates the cap.

In other words, click<8 will very likely give me click==7; only if a package pins click>=8 will modern solvers start looking through history. At least some of the time, I still get a valid solve. Capping the Python version will never scroll back the Python version, and so will never give me a valid solve.

This is also why lock files treat the Python version fundamentally differently. Packages are locked (hence the name, lock files), while Python is never locked (again, not counting conda). You can’t specify Python 3.9.9 or Python 3.9 or Python anything; instead, you get a range (in a lock file!) of Pythons for which the lock file is valid - you don’t know what Python version will try to install from the lock file. This is why PDM and Poetry force you to cap Python if Python is capped by a dependency (to match the lock file limits), but do not require you to cap any other dependency: those are locked to a single value in the lock file.
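
For example, a Poetry lock file records this validity range in its metadata (an illustrative fragment; the range shown is made up):

# poetry.lock (fragment)
[metadata]
lock-version = "1.1"
python-versions = ">=3.8,<3.10"  # the lock is only valid for these Pythons
content-hash = "..."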

Adjusting to a new version of python requires work. Look how long it is taking for conda-forge

Over half of conda-forge is already converted, and conda-forge started late. A good portion of the remainder is just waiting for CI to run and a maintainer to push a button; half of the remaining packages are waiting for parents to be ready. Brew was much quicker (almost same-day as the release), and it converted a massive number (almost 1/3 - 1/2, IIRC) within a week or so. Lots of packages do not require changes; some do, but many do not. Also, both systems are highly biased toward binary packages - pure Python packages are much less likely to require changes. Some of the “changes” are just getting the test suites to pass without deprecation warnings! If you add caps, though, 100% of packages require changes. But I was also referring to the design intention, which is that “good” packages should work on roughly the next two Python versions.

But version capping doesn’t satisfy that requirement, because it results in older versions (which pre-dated the addition of the cap) being downloaded. So let’s work out what they actually want so we can design a solution that does work. (I assume what they want is something along the lines of “ensure that tools never install any version of numba older than 0.55 on 3.10 or later”, and that can’t be done using only metadata on versions 0.52 and later).

I don’t have a solution here - it’s a hard problem. But applying the wrong solution and trying to patch it up to work, seems to me like the worst of both worlds.


because it results in older versions (which pre-dated the addition of the cap) being downloaded

This is currently the case, but Solution 2 would actually give Numba the behavior they would like. I listed a mix of good and bad points for it, but it is a “solution” for them (one that currently does not work). The other solutions would actually be closer to solving their problem too - removing the cap would at least get the most recent version, so they could work on a way to fail with the correct error message (the old versions don’t fail correctly).

The biggest problem with Solution 2 is that it makes it really easy to misuse capping to put limits on libraries. And it doesn’t fix the issue with locking package managers without their involvement; they would also have to avoid back-solving for Python upper caps.

For any solution, I am kind of volunteering to work on the needed .overlap method for SpecifierSet. :slight_smile:

I’ll just summarize what I wrote in the SciPy discussion here, just to kickstart the argument. I’m not really knowledgeable about this tooling; however, I do know professionally - since I do similar things for a living - that it is not the users’ fault if the majority of them are getting something wrong.

The first one is the easy one. As mentioned previously, Requires-Python is bad naming. Anybody who is not versed in packaging will not get this right; I didn’t even know there was supposed to be a difference. The name reads as exactly which Python versions the package will accept for installation. There is not even a hint of these subtleties.

That’s probably correct; however, this is exactly the problem - these tools don’t offer anything else. The argument above is that future versions might break something. To that we unfortunately say: it does break something. I did read the post multiple times, but you are not addressing this very point. We don’t want to maintain 9 versions across 4 major Python releases all the time. Things should error out when the time comes, so that we can nudge people forward. This is not upper capping; this is deprecation.

Say package v2.0 is released for Python 3.8. Everything is nice and dandy. Then come Python 3.9 and 3.10; these will be installed by users all the time, and they will all go to the package’s repo and report that version 2.0 is broken. Now you are leaving the package author under an obligation that nobody intended (maybe the release was meant only for Python 3.8, who knows), and you are also burdening everyone to keep up, which is simply design by wishful thinking - we don’t have that capacity anywhere. Things rot, and they should have the right to rot. As mentioned, SciPy was broken by every major Python release since 3.7, so this is not some arbitrary package complaint, and we spend a lot of time on every major release.

Thus, for these and some other reasons, I disagree with the sentiment, and I also find it “horrific” to walk the versions backwards to find a working solution. I don’t know who ordered that behavior, but installing whichever version doesn’t complain in a metadata tag is not really a solution to anything. I don’t know what it solves, but just randomly trying versions out does not constitute a solution either, and it is not a reason to blame others for bad practice.

Sorry, I wasn’t precise enough here. What I’m referring to is the version of the Python runtime that the package was being installed under; unlike conda, pip of course cannot upgrade or downgrade the Python version to satisfy such a dependency (EDIT: which I now see you talk about later).

I think our confusion may be that I’m referring to the behavior of user-facing installation tools, not developer-facing build tools here. Did you mean to only apply the above to the latter?

Sorry if I wasn’t clear - I’m talking about idea 2, “Implement upper capping properly”, here, as was the text of yours I was quoting. The title and the suggested text imply that a package installer would only raise an error if the upper cap would not be satisfied by the version of the Python runtime that the package was being installed under, instead of back-searching, while still allowing installation if it was satisfied. But the text here:

Implies that tools would produce an error if any upper bound was used, even if the SpecifierSet could be satisfied by the Python version the user is attempting to install under, which would mean that upper caps would not be usable at all (essentially a stronger version of Idea 1). What I was trying to clarify was precisely the circumstances under which you were suggesting tools raise an error versus install, and, to follow up on that now, how you are suggesting this apply to build vs. install tools.

Again, this wording was quoting Idea 2, not Idea 1, while the above wording appears to be for the latter. The issue was that the proposed wording for Idea 2 didn’t really say anything different from the current wording, as the statement and its inverse have fundamentally different meanings.

Just to note, @henryiii addresses this both in the linked blog post and in the SciPy issue you were part of. IPython originally asked for this in order to be able to drop old Python versions, and for that it works just fine, as @henryiii explains above and elsewhere. So long as the specifier is added with or before the first version to drop support for earlier Python versions, which is generally reasonable, if the user’s Python version (say, 2.7 or early 3.x) is unsupported by the most recent version of a package, pip will search back until it finds a package version that does support it or lacks the field. This has been the solution to this problem for a very long time and, generally speaking, has worked as intended.

The whole point seems to be that this behavior breaks down and doesn’t make much sense when applied to upper bounds on future Pythons, which it presumably wasn’t really intended for (but that wasn’t made clear), and which it sounds like you actually agree on. He then proposes multiple practical solutions to address the issue for the upper bound case, with varying degrees of tradeoff. None of them are just “blam[ing]” downstream package authors; rather, they either (1) clarify the documentation as to the purpose of this field while retaining the existing behavior, (2) allow upper bounds but avoid triggering the behavior you object to, erroring instead, as you presumably want, or (3) ignore the upper bound in that field while pivoting toward other, more appropriate (package-manager-specific) methods of handling the situation.


Sorry, I thought I already answered this in the other thread. It solves a very important problem that SciPy is heavily using. Load up Python 2.7 and type pip install scipy. It works. Why? Because pip checks scipy 1.7.3, and Requires-Python doesn’t match "2.7.18". It then checks the version before, and the version before that, and finally ends up on whatever version was the last one to match 2.7 (I don’t remember which one). Pip installs it, and boom, working SciPy - even though the SciPy developers don’t bother with 2.7 anymore. Same thing for 3.5, 3.6, and very soon 3.7. This is what Requires-Python was designed for, and it works wonderfully. This is why NEP 29 was able to specify such high lower limits - it’s not that everyone on Python 3.7 (still the most popular version of Python, last I checked) is now broken by the PyData community dropping it; they just get older packages to go with their older Pythons. It’s great!

It just was not designed for upper limits, and that’s exactly what this whole post is about and what all three listed solutions cover. That’s why the behavior is so weird and terrible for upper limits. Please read the initial post.

the version of the Python runtime that the package was attempting to be installed under

My point is that it makes no sense to try to develop a package with Python 3.10 if you cap <3.10, so I don’t see how you can have three variations here. I thought you were talking about solution 1; any Requires-Python with an upper limit would produce a warning under solution 1. If you were referring to solution 2, I might not have been making much sense, as I was replying to the wrong things. :slight_smile:

Implies that tools would produce an error if any upper bound was used

Sorry, yes, I think I thought you were talking about solution 1. For solution 2: if you were using pip on 3.10.2 and tried to install a package, and then computed requires_python.overlap(">3.10.2") and that returned None (no overlap), then pip would not check older versions, and would just return an error message stating “This package requires {requires_python}”. If there is an upper cap, and that upper cap has no overlap with the versions above the current version, then this package is not valid for the “back-search”.


I should have stated my opening remark more strongly: I’m just carrying over my confusion and the arguments from there, not necessarily repeating them to be discussed again. My apologies.


I think another way to phrase this is that requires-python says what may work by excluding what is guaranteed not to work. This is the same philosophy that wheel tags take: tags express what may be compatible, and thus exclude what has no chance of working. Using an upper bound is prematurely guessing what may not work in the future, and that goes against what requires-python is meant to express (under this definition).

This also makes the term “requires” accurate: you are requiring that Python at least match the specified requirement to have any hope of working. After that, you hope you have what’s necessary to install and use the package (which is impossible to fully express in metadata for all scenarios without resorting to containers).


Yeah, as mentioned (and you later figured out) I was referring to solution 2, and talking about user-facing install tools rather than developer-facing build tools. :slight_smile:

Yep, this is the third interpretation, and the one I thought you meant. I was just confused by some of your wording, sorry.

I have to disagree. This is a very confusing statement for anybody who has no idea what this tag is for. Everywhere else in the ecosystem, requirements are needed unconditionally - even requirements.txt is an example of this behavior, not “well, at least these might work”. Quite the contrary: you must have these requirements. The word here, I repeat, is unfortunate and should be explained better.

I disagree again. This is not premature guessing; it has been practically demonstrated with every major Python release, in both NumPy and SciPy. We - well, I should speak for myself - at least I do not share the faith or the optimism in the backwards compatibility of the Python ABI and/or its standard modules, as we have been actively struggling with these.
