PEP 723: use a new `[run]` table instead of `[project]`?

Thanks for your and @ofek’s responses.

So when we say commented TOML, it could look like(?):

# [run]
# dependencies = [
#   "rich", # comment
#   #comment
#   "requests=1.0",
# ]

Yup, that is completely valid TOML

1 Like

Erm, I’m not sure I understand?

I don’t see beginners using this (or any) PEP in relation to Python version management. Beginners should basically never have more than one version of Python installed, so they have no need for such a thing.

And given that I’ve already said that I don’t like framing this as being about “beginners”, I’ll note that the same thing applies to anyone (beginner, expert, or anywhere in between) who uses Python for the “better batch file” use case, or as a tool to enable them to do a job that isn’t “programming a Python application”.

Okay, here’s an example: a user is only willing to download a single tool, because that is what they read about online and it’s easy for them. By default, their system provides Python 3.10, and they want to use match statements from 3.11. Which is better for them: a tool that handles it for them, or learning how to manage Python installations?

I think the PEP 722 extension of allowing X-User-Defined-Thing might be the best solution for effectively punting on the version management issue?

There are lots of target users who are only vaguely aware that there are multiple versions of Python. 2 and 3! :stuck_out_tongue_winking_eye:
They get Python via the installer or Linux distro packaging, and they have no need for a Python version specifier.

But I’d also like to embed some of those “better batch scripts” in shared places for use by other members of my team, e.g. a scripts/ directory in the root of a common repo.
In that case, specifying a minimum Python version might be important, even if it just results in an error for team members running old Pythons. But the script author in this case is a well-versed Python user who can lead off with an if sys.version_info check in the worst case.
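For concreteness, a minimal guard of that sort (the 3.11 minimum here is just an example, not anything the PEPs prescribe) might look like:

import sys

# Example minimum version; whatever the script actually needs goes here.
if sys.version_info < (3, 11):
    sys.exit(f"This script requires Python 3.11+; you are running {sys.version.split()[0]}.")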

If use cases are so numerous that there’s quickly demand to expand PEP 72{2,3} to support python_requires, I don’t think that kind of additive change to the spec is a major issue.

1 Like

I must say I do not understand the debate about supporting or not supporting python-requires.

What does “supporting” mean here? If it’s, for pipx or pip-run, a matter of erroring out when the current Python version is incompatible with the python-requires value… that should be ~5 lines of code? I can’t imagine that they’d be OK with implementing PEP 723 but leaving out requires-python.
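For a rough sense of scale (assuming the runner already has the parsed metadata and is happy to use the packaging library; the version string below is just an example), the check could be something like:

import platform
import sys

from packaging.specifiers import SpecifierSet

requires_python = ">=3.11"  # example value read from the script's metadata
if platform.python_version() not in SpecifierSet(requires_python):
    sys.exit(f"This script requires Python {requires_python}; you have {platform.python_version()}.")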

“Supporting” could also mean Python version management features (by which I mean auto-downloading and installing given Python versions), but I don’t really see pipx or pip-run doing that… and that’s fine? I mean, as long as packaging is organized around several tools and PEPs do not have the power to dictate the behavior of those tools, is there a problem with python-requires metadata being used by some tools to restrict which scripts can be run (hard error if the Python version doesn’t match), and by others to auto-manage Python versions (run the script with a compatible Python)?

Here I’m confused, because you said (IIUC) that you preferred no [tool] table in this iteration of the proposal, but this sounds a lot like a different incarnation of it.

It’d be useful to have some actual input from pipx/pip-run maintainers. Unfortunately GitHub doesn’t list repository maintainers, but at least @bernatgabor is a maintainer of pipx on PyPI.

pip-run is maintained by @jaraco.

I don’t think that this is something these PEPs (which are primarily driven by package requirements) should be treating as a vital constraint. Leaving room in a specification so as not to preclude future metadata is fine, but I’ve commented before about a concern over scope creep, which I do want to reiterate: I’m unsure why the run.dependencies proposal needs to have anything to do with the interpreter.

A

1 Like

We could require the below, but I fear that would limit what you’re looking for in terms of future expansion of the [run] table.

run.dependencies = [
    "...",
]

Perhaps # Script Metadata as a marker – I think # [run] on its own is likely too generic, even if not currently used much.
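For anyone following along: once uncommented, the dotted-key spelling and the table spelling parse to the same structure, which is also why only the table form has a [run] line that could double as a marker. Roughly:

# Dotted-key form (no table header to act as a marker):
# run.dependencies = [
#     "...",
# ]

# Table form (the "# [run]" line is what could serve as a marker):
# [run]
# dependencies = [
#     "...",
# ]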

A


P.S. @moderators would it be possible to split the recent discussion into a [dependencies.run] thread please? (perhaps starting at PEP 723: Embedding pyproject.toml in single-file scripts - #123).

I only say this as it might be a helpful insight from someone ignorant of pyproject.toml: the specifiers threw me off. I was stumped after I first tried the following in a TOML linter:

[run.dependencies]
package1=1.0

It was accepted as a key/value pair. So then I tried

package2 # without any specifier

And of course it balked at the missing “=”. And then I was getting very confused about how “<”, “>”, etc. work! Of course, now I understand that we are assigning a list of strings to “dependencies”.
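For anyone else who trips over the same thing, the intended shape (package names here are just placeholders) is a single dependencies key whose value is a list of requirement strings:

[run]
dependencies = [
    "package1==1.0",
    "package2",  # no specifier: any version is acceptable
]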

2 Likes

I think your confusion (which I share) stems from “supporting python-requires” meaning either

  • allow the key but don’t stipulate precise semantics
  • allow the key and stipulate exact semantics
  • exact semantics, but with “SHOULD” rather than “MUST”

If it’s imprecise, one audience is unsatisfied because of the imprecision and the potential for tools to diverge. If it’s fully specified, another audience might be displeased because we’re trying to set behavior for tools without necessarily having a full grasp of what’s needed.

I think it’s unlikely that we’ll find a version of the python_requires data which will satisfy everyone. Putting it in some kind of extension space spares us having to sort it out right now.


The extension space for the [run] table is very different from [tool] in that one is already in use by tools and one is not. So it dodges a lot of the issues about tool configs in different places unless tool authors want to explicitly go after those sorts of problems.

1 Like

No problem! I had my team read through both PEPs this morning, but it was mostly to get them into the right head space, explain the background, to start thinking about how we could do a user study around this, etc.

I’m in no rush to make a final decision and we can tweak stuff along the way. It will take some time to line up a user study anyway, so while @courtneywebster thinks through how she wants to run one, we can iterate as needed. We can also make sure that any user study asks key high-level questions (e.g., formatting) that are going to come up regardless, as well as other questions such as whether supporting a [tool] table would be too confusing, whether TOML is easy enough to learn regardless of what table(s) are supported, etc. Plenty to get beginner feedback on that isn’t inherently tied to any future change we make here.

They can error out, when the required version of Python isn’t installed, in a much clearer way than some SyntaxError or AttributeError for something in the stdlib. And that’s just the easy case of something obviously missing, compared to some subtle semantic change.

For example, a tool could say, “the specified file requires Python >=3.8, but only Python 3.7 was found. Please install an appropriate version of Python.” which is clearer than TypeError: 'type' object is not subscriptable (which is something I ran into trying to backport some code in packaging to Python 3.7).

And this is one of the key reasons we are going to do user studies as I think we need some independent feedback from beginners as to what format they find the least confusing.

2 Likes

I think the whole requires-python question comes down to whether we want to be writing down what’s required to make a single-file script/application run, or whether this is just a list of requirements that must somehow be installed. If it’s the former, then the version of Python that the code requires is part of the information needed to run the script (and the only other thing I can think of that’s necessary beyond its dependencies). But if it’s the latter case of just requirements, then the Python version requirement is obviously superfluous. How much this matters probably depends on whether you think such scripts should be as shareable as possible across machines, since that’s where specifying the required Python version comes in.

6 Likes

I’m not sure I get you here. Why is it specifically important that two particular tools (pipx and pip-run) implement the behavior rather than some other particular tool (hatch)?

I think that’s a good solution, as long as the spec is defined to be extensible in that manner. That is, if the spec says “there is a requirements block, that’s it”, then that doesn’t leave scope for hatch to add Python-requirements. Of course, hatch can add that as a totally separate magic comment, but then we’ll just be back in the same situation we’re in now, with some tools implementing behavior that hasn’t yet been standardized.

In my experience that isn’t entirely true. I’m someone who often wants to use Python for the better batch file case, but even so, this sometimes does require thinking about different Python versions. Different “better batch file” scripts may require different Python versions, and it’s not always possible to just upgrade to the latest one and use that for all. For instance, you may have some of these batch files that need to be run with some installed Python that you don’t control (e.g., because they’re doing system tasks), but at the same time may have other “personal” batch files you also want to run that require a newer version of Python, and for these ones you may be able to run them in an environment where you do control the Python version.

I continue to think it is best to simply think of the Python version as part and parcel of what is required to run a script.

That sounds wonderful. I do hope that we can remain open to the possibility that user feedback may lead us towards a solution which differs from all of the current proposals.

What distinction are you drawing between “what’s required” and “requirements”?

1 Like

There is definitely a use case for being able to define the Python version required, but at the same time, supporting that is also significantly more complex. I agree that, at least for now, it should be optional. PEP 722/723 already provide a lot of value as is. This part, while also providing value, should not hinder this PEP. Let it be up to the tools to decide how much they want to lock down with the information that is given. Personally, I’d like to be able to use this dependency specification with locking data in the same file, but that’s clearly another step further and out of scope here.

2 Likes

Exactly this. People have different requirements, and I think we can consider both use cases.

I suggest that for this PEP we keep requires-python optional, as that’s far easier to implement, and so we get more support for the important part of this PEP: non-interpreter dependencies. At the same time, it would be great if, while optional, recording requires-python were encouraged and taken into account during execution to improve reproducibility, and if tools executing the script were encouraged to warn about a mismatching Python version, so there are no surprises when the script uses features that are not available in the running Python.

Then it is entirely up to whoever wrote the script to decide whether they care about that level of reproducibility or not and choose tooling for it accordingly.

Paul, I think (correct me if I’m wrong) you’re interpreting my advocacy of the Python requirement as being exclusively about my maintainership of Hatch, and I feel (again, I could be wrong) that you find that quite off-putting.

While it is true that Hatch would be able to satisfy the Python constraints, it is not true that it would be the sole beneficiary, as other tools, like the py launcher, could also support that. Furthermore, and most importantly, I think you should look at it not as a feature per se but rather as UX for ensuring proper run-time behavior. If pip refuses to install when the Python version requirement is not met, then certainly any runner should follow suit, lest the user encounter random errors, as Brett was saying.

As for being (possibly) the cause of that perturbation, I don’t really know how to fix it. It’s not as if I’ll be writing this feature just for fun, like “oh wow, if I write this and this here and there then cool stuff happens! progress bars so pretty!!”. No, not at all. Development takes significant time out of my personal life, as it does for the rest of us. Supporting Python management is a feature that will tremendously help users and has been requested, outside of Hatch, for over a decade. Users simply don’t want to deal with that, nor should they have to.

4 Likes

Pants has the notion of applying an interpreter constraint to every Python file. I could see us scraping this new field to populate that metadata on behalf of the user.


That being said, my vote is still that we take this piecemeal. First we decide on a format (because we haven’t done that yet) in a way that most people are OK with (Ofek wants TOML, Paul’s OK with it but not pyproject.toml, Brett’s working on user surveys, and then there are everyone’s comments on the 3 threads). Then we can decide what goes in. It seems this discussion is starting to mirror, somewhat, Projects that aren't meant to generate a wheel and `pyproject.toml`, so perhaps that’s not a coincidence?

I think for this to be the smoothest sailing, we might want to tackle each sub-problem separately, or else we’re stuck discussing how to format things and deciding on multiple fields in PEPs dedicated to getting single-file scripts runnable.

3 Likes

I hope I’m not, but I can see how it would look that way.

My real concern here is that my main motivation for writing PEP 722 at all was to standardise existing practice, in such a way that tools like VS Code could interoperate with that existing behaviour without having to rely on implementation-dependent information. Therefore, to me it is a key point that those existing tools will adopt the new standard - otherwise, in my view, we’ve missed the whole point of standardisation.

With that in mind, whether tools that don’t currently implement a “run script with dependencies” feature (such as hatch) adopt this standard is largely irrelevant - if (for example) VS Code adds an “insert runtime dependency data” action that works with hatch, but doesn’t work with existing tools, we haven’t achieved any real form of interoperability; rather, we’ve created a new approach and applied the weight of standardisation to exert a type of “implement this or be left behind” pressure. I don’t believe that’s the right way to use the standards process.

As I say - all of this may be irrelevant. The existing tools may have no issue with adding support for a “requires Python version X” check. But that’s not (IMO) a foregone conclusion - pipx run --python /path/to/my/python somescript.py doesn’t have access to the requested interpreter without an additional subprocess call, so checking the version is costly. As a result, I don’t know if I would agree to adding a Python version check - the benefit is (IMO) minimal and the cost is non-trivial.
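To make that cost concrete, this is roughly the extra hop involved when the interpreter being checked isn’t the one running the tool (an illustrative sketch, not pipx’s actual code):

import json
import subprocess

def python_version(python_path):
    """Ask a (possibly different) interpreter for its version via a subprocess."""
    out = subprocess.run(
        [python_path, "-c", "import sys, json; print(json.dumps(sys.version_info[:3]))"],
        capture_output=True, text=True, check=True,
    ).stdout
    return tuple(json.loads(out))  # e.g. (3, 10, 12)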

All I’m really asking here is that we don’t add features to the standard that the existing tools providing solutions for this use case are unhappy with adopting. I’m really quite confused why that’s seen as such a big ask[1]. But maybe I’ve not explained myself well enough - with luck, this clarifies things.


  1. Maybe because I didn’t make a big point of it with PEP 722? But as I said, that proposal was specifically designed to match the existing implementations, so I felt comfortable assuming it wouldn’t be an issue. ↩︎

1 Like

Would you be more comfortable if the error were a SHOULD and satisfying the constraint were a MAY?