I mostly agree with this. However, I do have one major reservation, which is that I think there’s a significant confusion over the different use cases:
1. Standalone pyproject.toml for a project in its own directory.
2. Metadata embedded in a script file that’s intended to be run as an independent program.
3. Metadata embedded in a Python file that’s part of a larger component (an importable module, or an application’s driver script).
4. Single-file projects where a wheel gets built from one Python file.
The problem is that all of (2), (3) and (4) are “single file” cases, but they have very different semantics. So they may want different metadata, different rules for combining values from multiple sources, etc.
PEP 722 was focused purely on case (2). PEP 723 drifted into the area of (3) and (4) as well (especially with the discussion on the [tool] section). It may well be that the right answer is to have completely independent mechanisms for addressing the different cases, with different syntax, and different sets of allowed metadata. But if we focus solely on “how do we embed metadata in a single Python file?” we will miss the question of whether different use cases have different requirements - and when we inevitably come back to it, we might find it’s too late.
This is why I prefer to discuss use cases such as the “better batch file” scenario, rather than principles or hypothetical workflows. One syntax per use case is obviously a bad idea, but so is a syntax designed by committee to solve everything, that is good at nothing.
I wish we could simply try to solve the “better batch file” use case, and leave the rest for when we have more real world experience of what people want in those situations. But I seem to be in a minority - so I’m simply trying to make the best of a bad job and ensure that whatever does come from the debate is at least reasonably usable in that situation.
I would be more comfortable if someone asked the projects, rather than assuming I can speak for them.
Yes, if it’s completely legal to pretend that python-requires doesn’t exist, I’m not going to say this is a showstopper. But it’s still (IMO) pretty impolite to write a standard for a use case that no-one but pip-run and pipx cared about until now, and not do our best to make sure that it works for them.
Yes, precisely. And also, sorry. I assumed (it went without saying that) this potential discussion/PEP would absolutely be framed around these concurrent discussions going on and the various use-cases they contain.
Within a larger discussion, we can arrive at one-way or multi-way explicitly. Either way we prove to the “Python’s ecosystem is very fragmented” crowd that the result was intentional, not accidental.
My fear is that we end up implicitly choosing a solution because we did so with blinders on, and it turns out to be the wrong one.
I know doing this sort of thing takes even more time and even more energy though. I’m happy to do my part to help.
Oh, definitely! My team will keep me honest on this one. It’s also why I’m trying to not guide the discussion too much as I want to avoid personal bias as much as possible.
Your package/distribution requirements are a part of what’s required to run your code. Thinking in terms of constructing an environment to run your code with, you need a Python interpreter and any dependencies your code depends on. Your requirements make up the latter need, while both combined make up what’s required. And since you can’t infer one from the other, some folks are asking for a way to specify the required Python version.
Somewhat. I view this discussion more as whether @ofek and @pf_moore can agree on a joint PEP and what that might entail, while the other topic is more about the details of what a new TOML table might look like for specifying what applications need to run, of which single-file scripts can be viewed as a subset and quite possibly would get used by the hypothetical joint PEP.
To be clear, I am personally very much looking at this whole thing as a PEP delegate optimizing for use case 2. If other use cases are somehow enabled by the outcome then that’s a bonus, but I am not optimizing for it. For instance, the reuse of TOML would be for simplicity of explanation of the format (assuming we find out beginners don’t find learning TOML difficult), and for reuse of knowledge/documentation where you learn something once and it works regardless of where you write it down. I personally don’t view TOML as important to empower avoiding creating a full-blown project directory for as long as possible.
Various authors have been @ mentioned, but if we don’t hear from them by the end of the week I will go and ask the projects via their issue trackers.
FYI the Python Launcher for Unix will support the outcome of all of this. Developing subcommand support for the Launcher was to open up two specific use cases where there wasn’t a standard and thus I didn’t feel comfortable baking them into the py command itself: a py pip command that automatically creates a virtual environment as needed, and a py run command that was going to do what pipx run supports via your change. But if this all becomes a standard I will simply embed the py run command, since it will be following a standard rather than being a separate thing that has to evolve to match some other tool.
And in this specific case, the required Python version becomes an implicit filter on what Python interpreter to use to construct the environment to run the code with.
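A minimal sketch of that filtering, handling only the “>=X.Y” form of specifier (the helper name is mine; a real tool would use something like packaging.specifiers to cover the full grammar):

```python
import sys


def satisfies_requires_python(spec: str, version=sys.version_info) -> bool:
    """Very rough check that the running interpreter matches a '>=X.Y' spec."""
    if not spec.startswith(">="):
        raise ValueError("sketch only handles '>=X.Y' specifiers")
    major, minor = (int(part) for part in spec[2:].split(".")[:2])
    return (version.major, version.minor) >= (major, minor)


# A runner could use this to skip interpreters that fail the check.
print(satisfies_requires_python(">=3.8"))
```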
Trying to keep all the lively discussion between these threads straight, hopefully this is on-topic.
I agree that the different use cases complicate the appropriate way to standardize this type of metadata.
As a user, I view this type of script file as a short-lived or rarely used program with minimal caveats, probably originally written in Python instead of some monstrosity of bash + GNU coreutils.
I can see the value of somehow specifying a required Python version, but if tools are just going to error out then it doesn’t offer much value over a simple inline check:

    import sys

    if sys.version_info < (3, 8):
        raise RuntimeError("Python version >= 3.8 required")
Scripts I write that fall into (2) aren’t typically distributed though so I have better control of the base environment. I favor making a properly importable/installable package with more complete metadata for something distributed, but I can imagine users sharing standalone scripts and still wanting things to “just work”.
I think you mainly mean pipx and pip-run, but I’m the maintainer of a direct competitor, viv, with features from both of those tools. I’d say at best I would support producing an error/warning, if that’s the agreed solution, but I don’t think it adds significant value in that instance over the snippet above.
I would think that, to a new user, having multiple TOML tables like [run] in a file is more confusing than [project] in a pyproject.toml. The use of TOML makes a lot more sense, however, if it’s meant to provide easier extensibility to other tools down the road beyond the [run] table, but that does feel outside the scope of PEP 722/723.
I meant the confusion of seeing, in a Python file your co-worker shared with you, the below:

dependencies = [

But then later you write a script yourself and use what you had seen before in other people’s pyproject.toml, which was

dependencies = [

Then you get an error, or worse no info, if tools don’t care that there isn’t a [run] table in the TOML block.
I think pipx run currently has no issues being a script runner for a plain python file with no dependencies, not sure how pip-run behaves.
But maybe the different use cases of [project] and [run] are obvious enough to users.
Okay, that makes sense, but “requirements” and “what’s required” are pretty much synonyms in everyday usage, so hopefully we can come up with some better terminology (at least by the time any of this has to be documented for a general audience). Like maybe one way would be to distinguish “pip-installable requirements” from “non-pip-installable requirements”? But…
You can infer Python from the others if Python is “just another dependency”. I don’t want to belabor this too much, but I do want to take the opportunity to point out why I think that very high-level choices like manager-first vs Python-interpreter-first are just so relevant to thinking about these things, and why I think it would be beneficial to really consider them in addition to the more immediate concerns. If Python is just another package in the dependency tree, it seems to me that many questions like the one under discussion would just become immensely easier, because there is no longer a need to have two parallel tracks for specifying versions, or to worry about what may happen if the installer tool doesn’t control the Python version, and so on.
i.e., does an environment manager control the installation of Python (a la conda), or does Python control the installation of an environment manager (a la venv)?
But that’s not how Python packaging has ever worked, and it would require a good bit of work to rethink the function of launchers, Python version managers, the use of pybi, and the impact on Python in the browser. As such it’s not a relevant item here but a distraction in the discussion of this set of PEPs.