There is definitely a use case for being able to define the Python version required, but at the same time, being able to support that is also significantly more complex. I agree here that at least for now it should be optional. PEP 722/723 already provide a lot of value as is. This part, while also providing value, should not hinder this PEP. Let it be up to the tools to decide how much they want to lock with the information that is given. Personally, I like to be able to use this dependency specification with locking data in the same file, but that’s clearly also another step further and out of scope here.
Exactly this. People have different requirements, and I think we can consider both use cases.
I suggest for this PEP we keep `requires-python` optional, as it's far easier to implement, and to get more support for the important part of this PEP: non-interpreter dependencies. At the same time, it would be great if, while being optional, recording `requires-python` is encouraged and considered during execution, to improve reproducibility when executing. And tools executing the script should be encouraged to warn about a mismatching Python version, so there are no surprises when the script uses features that are not available in the running Python.
Then it is entirely up to whoever wrote the script to decide whether they care about that level of reproducibility or not and choose tooling for it accordingly.
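The "warn rather than refuse" behavior suggested above could look something like this in a runner. This is a hedged sketch: `check_requires_python` and the tuple-based version argument are illustrative names, not any tool's actual API.

```python
import sys
import warnings

def check_requires_python(minimum):
    """Hypothetical runner helper: record requires-python, but only warn
    on a mismatch rather than refusing to run the script."""
    if minimum is not None and sys.version_info < minimum:
        warnings.warn(
            "script requests Python >= %s but %s.%s is running"
            % (".".join(map(str, minimum)), *sys.version_info[:2]),
            stacklevel=2,
        )
```

Whether the mismatch is a warning (MAY satisfy) or an error (MUST satisfy) is exactly the SHOULD/MAY question raised later in this thread.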
Paul, I think (correct me if I'm wrong) you're interpreting my advocacy of the Python requirement as being exclusively about my maintainership of Hatch, and I feel (again, could be wrong) that you are finding that quite off-putting.
While it is true that Hatch would be able to satisfy the Python constraints, it is not true that it would be the sole beneficiary, as other tools such as the `py` launcher could also support that. Furthermore, and most importantly, I think you should look at it not as a feature per se but rather as UX for ensuring proper run-time behavior. If `pip` fails to install when the Python version is not met, then certainly any runner should follow suit, lest the user encounter random errors, as Brett was saying.
As far as being (possibly) perturbed by my advocacy, I don’t really know how to fix that. It’s not as if I’ll be writing this feature just for fun like “oh wow if I write this and this here and there then cool stuff happens! progress bars so pretty!!”. No, not at all. Development takes a significant time out of my personal life like the rest of us. Supporting Python management is a feature that will tremendously help users and has been requested outside of Hatch for over a decade. Users simply don’t want to deal with that, nor should they have to.
Pants has the notion of applying an interpreter constraint to every python file. I could see us scraping this new field to populate that metadata on behalf of the user.
That being said, my vote is still that this is done piecemeal. First we decide on a format (because we haven't done that yet) in a way that most people are OK with (Ofek wants TOML, Paul's OK with it but not pyproject.toml, Brett's working on user surveys, and then everyone's comments on the 3 threads). Then we can decide what goes in. It seems this discussion is starting to mirror, somewhat, Projects that aren't meant to generate a wheel and `pyproject.toml`, so perhaps that's not a coincidence?
I think for this to be the smoothest sailing, we might want to try and tackle each sub-problem separately, or else we're stuck discussing how to format things, or deciding on multiple fields, in PEPs dedicated to getting single-file scripts runnable.
I hope I’m not, but I can see how it would look that way.
My real concern here is that my main motivation for writing PEP 722 at all was to standardise existing practice, in such a way that tools like VS Code could interoperate with that existing behaviour without having to rely on implementation-dependent information. Therefore, to me it is a key point that those existing tools will adopt the new standard - otherwise, in my view, we’ve missed the whole point of standardisation.
With that in mind, whether tools that don’t currently implement a “run script with dependencies” feature (such as hatch) adopt this standard is largely irrelevant - if (for example) VS Code adds an “insert runtime dependency data” action that works with hatch, but doesn’t work with existing tools, we haven’t achieved any real form of interoperability, rather we’ve created a new approach and applied the weight of standardisation to enforce a type of “implement this or be left behind” pressure. I don’t believe that’s the right way to use the standards process.
As I say - all of this may be irrelevant. The existing tools may have no issue with adding support for a "requires Python version X" check. But that's not (IMO) a foregone conclusion - `pipx run --python /path/to/my/python somescript.py` doesn't have access to the requested interpreter without an additional subprocess call, so checking the version is costly. As a result, I don't know if I would agree to adding a Python version check - the benefit is (IMO) minimal and the cost is non-trivial.
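To illustrate the cost being described here: when the user supplies an arbitrary `--python` path, the only reliable way for the runner to learn that interpreter's version is to spawn it. A rough sketch (the helper name is mine, not pipx's):

```python
import subprocess

def interpreter_version(python_path):
    """Ask an interpreter for its version; costs one subprocess launch."""
    result = subprocess.run(
        [python_path, "-c",
         "import sys; print('%d.%d' % sys.version_info[:2])"],
        capture_output=True, text=True, check=True,
    )
    major, minor = result.stdout.strip().split(".")
    return int(major), int(minor)

# e.g. interpreter_version("/path/to/my/python") before deciding to run
```

That extra process launch on every invocation is the non-trivial cost weighed against the benefit of the check.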
All I’m really asking here is that we don’t add features to the standard that the existing tools providing solutions for this use case are unhappy with adopting. I’m really quite confused why that’s seen as such a big ask. But maybe I’ve not explained myself well enough - with luck, this clarifies things.
Maybe because I didn’t make a big point of it with PEP 722? But as I said, that proposal was specifically designed to match the existing implementations, so I felt comfortable assuming it wouldn’t be an issue. ↩︎
Would you be more comfortable if the error was SHOULD and satisfying the constraint is MAY?
I mostly agree with this. However, I do have one major reservation, which is that I think there’s a significant confusion over the different use cases:
- Metadata in `pyproject.toml` for a project in its own directory.
- Metadata embedded in a script file that’s intended to be run as an independent program.
- Metadata embedded in a Python file that’s part of a larger component (an importable module, or an application’s driver script).
- Single-file projects where a wheel gets built from one Python file.
The problem is that all of (2), (3) and (4) are “single file” cases, but they have very different semantics. So they may want different metadata, different rules for combining values from multiple sources, etc.
PEP 722 was focused purely on case (2). PEP 723 drifted into the area of (3) and (4) as well (especially with the discussion of the `[tool]` section). It may well be that the right answer is to have completely independent mechanisms for addressing the different cases, with different syntax, and different sets of allowed metadata. But if we focus solely on "how do we embed metadata in a single Python file?" we will miss the question of whether different use cases have different requirements - and when we inevitably come back to it, we might find it's too late.
This is why I prefer to discuss use cases such as the “better batch file” scenario, rather than principles or hypothetical workflows. One syntax per use case is obviously a bad idea, but so is a syntax designed by committee to solve everything, that is good at nothing.
I wish we could simply try to solve the “better batch file” use case, and leave the rest for when we have more real world experience of what people want in those situations. But I seem to be in a minority - so I’m simply trying to make the best of a bad job and ensure that whatever does come from the debate is at least reasonably usable in that situation.
I would be more comfortable if someone asked the projects, rather than assuming I can speak for them.
Yes, if it's completely legal to pretend that `requires-python` doesn't exist, I'm not going to say this is a showstopper. But it's still (IMO) pretty impolite to write a standard for a use case that no-one but `pipx` cared about until now, and not do our best to make sure that it works for them.
Yes, precisely. And also, sorry. I assumed (it went without saying that) this potential discussion/PEP would absolutely be framed around these concurrent discussions going on and the various use-cases they contain.
Within a larger discussion, we can arrive at one-way or multi-way explicitly. Either way we prove to the “Python’s ecosystem is very fragmented” crowd that the result was intentional, not accidental.
My fear is we end up implicitly choosing a solution because we did so with too many blinders on, and it was the wrong one.
I know doing this sort of thing takes even more time and even more energy though. I’m happy to do my part to help.
Oh, definitely! My team will keep me honest on this one. It’s also why I’m trying to not guide the discussion too much as I want to avoid personal bias as much as possible.
Your package/distribution requirements are a part of what's required to run your code. Thinking in terms of constructing an environment to run your code with, you need a Python interpreter and any dependencies your code depends on. Your requirements make up the latter need, while both combined make up what's required. And since you can't infer one from the other, some folks are asking for a way to specify the required Python version.
Somewhat. I view this discussion more as whether @ofek and @pf_moore can agree on a joint PEP and what that might entail, while the other topic is more about the details of what a new TOML table might look like for specifying what applications need to run, of which single-file scripts can be viewed as a subset, and which quite possibly would get used by the hypothetical joint PEP.
To be clear, I am personally very much looking at this whole thing as a PEP delegate optimizing for use case 2. If other use cases are somehow enabled by the outcome then that’s a bonus, but I am not optimizing for it. For instance, the reuse of TOML would be for simplicity of explanation of the format (assuming we find out beginners don’t find learning TOML difficult), and for reuse of knowledge/documentation where you learn something once and it works regardless of where you write it down. I personally don’t view TOML as important to empower avoiding creating a full-blown project directory for as long as possible.
Various authors have been @ mentioned, but if we don’t hear from them by the end of the week I will go and ask the projects via their issue trackers.
FYI the Python Launcher for Unix will support the outcome of all of this. Developing subcommand support for the Launcher was to open up two specific use cases where there wasn't a standard, and thus I didn't feel comfortable baking them into the `py` command itself: a `py pip` command that automatically creates a virtual environment as needed, and a `py run` that was going to do what `pipx run` supported via your change. But if this all becomes a standard I will simply embed the `py run` command, since it will be following a standard rather than being something that needs to evolve as a separate thing where I have to try and match some other tool.
And in this specific case, the required Python version becomes an implicit filter on what Python interpreter to use to construct the environment to run the code with.
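As a toy illustration of that filtering (tuples stand in for real PEP 440 specifiers, and the names and paths are mine, not any tool's):

```python
def pick_interpreter(candidates, minimum):
    """Return the newest discovered interpreter meeting the requirement.

    `candidates` maps (major, minor) version tuples to interpreter paths;
    `minimum` is the script's required Python version. Returns None when
    nothing qualifies.
    """
    eligible = [version for version in candidates if version >= minimum]
    return candidates[max(eligible)] if eligible else None

# Hypothetical interpreters a runner discovered on PATH:
found = {(3, 8): "/usr/bin/python3.8", (3, 11): "/usr/bin/python3.11"}
```

A real tool would parse full PEP 440 specifiers (e.g. `>=3.10,<4`) rather than a single minimum tuple, but the "filter then pick" shape is the same.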
Trying to keep all the lively discussion between these threads straight, hopefully this is on-topic.
I agree that the different use cases complicate the appropriate way to standardize this type of metadata.
As a user I view this type of script file as a more short-lived or rarely used program with minimal caveats, probably originally written to use Python instead of some monstrosity of bash + GNU coreutils.
I can see the value of somehow specifying a required python version, but if tools are just going to error out then it doesn’t offer much value over a simple included check:
```python
import sys

if sys.version_info < (3, 8):
    raise RuntimeError("python version >= 3.8 required")
```
Scripts I write that fall into (2) aren’t typically distributed though so I have better control of the base environment. I favor making a properly importable/installable package with more complete metadata for something distributed, but I can imagine users sharing standalone scripts and still wanting things to “just work”.
I think you mainly mean `pip-run`, but I'm the maintainer of a direct competitor, `viv`, with features from both of these libraries. I'd say at best I would support producing an error/warning, if that's the agreed solution, but I don't think it adds significant value in that instance over the snippet above.
I would think to a new user it's more confusing to have multiple TOML data tables - `[run]` in a script file vs `[project]` in a `pyproject.toml`. The use of TOML however makes a lot more sense if it's to provide easier extensibility to other tools down the road outside of the `[run]` table, but that feels outside the scope of PEP 722/723.
Which multiple "TOML data tables" are you thinking of? Unless `[run]` is defined to be repeatable (which I don't think it would be), there would only be one embedded in a file.
I meant the confusion of seeing, in a Python file your co-worker shared with you, the below:

```toml
[run]
dependencies = [
    "requests",
]
```

But then later you wrote a script yourself and used what you had seen before in other people's `pyproject.toml`, which was:

```toml
[project]
dependencies = [
    "requests",
]
```

Then getting an error, or worse no info at all, if the tool doesn't care that there isn't a `[run]` table in the TOML block.
`pipx run` currently has no issues being a script runner for a plain Python file with no dependencies; not sure how
But maybe it's obvious enough to users the different use case of
Suppose your script uses a `match` statement. If you add this kind of check, running it on an older Python will raise a `SyntaxError` before the check has even executed.
There was a suggestion to use `requirements` instead of `dependencies` to be more distinct (same with
Okay, that makes sense, but "requirements" and "what's required" are pretty much synonyms in everyday usage, so hopefully we can come up with some better terminology (at least by the time any of this has to be documented for a general audience). Like maybe one way would be to distinguish "pip-installable requirements" from "non-pip-installable requirements"? But...
You can infer Python from the others if Python is “just another dependency”. I don’t want to belabor this too much, but I do want to take the opportunity to point out why I think that very high-level choices like manager-first vs Python-interpreter-first are just so relevant to thinking about these things, and why I think it would be beneficial to really consider them in addition to the more immediate concerns. If Python is just another package in the dependency tree, it seems to me that many questions like the one under discussion would just become immensely easier, because there is no longer a need to have two parallel tracks for specifying versions, or to worry about what may happen if the installer tool doesn’t control the Python version, and so on.
i.e., does an environment manager control the installation of Python (a la conda), or does Python control the installation of an environment manager (a la venv) ↩︎
But that's not how Python packaging has ever worked, and it would require a good bit of work to rethink the function of launchers and Python version managers, the use of pybi, and the impact on Python in the browser. As such, it is not a relevant item but a distraction in the discussion of this set of PEPs.
Ah yes, that would be a problem. I rarely write code that isn't syntactically correct for any Python > 3.6, and I misremembered how the interpreter worked.
Using `requires-python` to warn the user may prove helpful then, if sharing code that isn't backward compatible.
Here is the updated PEP: Final - PEP 723: Embedding pyproject.toml in single-file scripts
Since the latest draft of the PEP introduces `[run]`, closing this so future discussions happen over on Final - PEP 723: Embedding pyproject.toml in single-file scripts.