How would you like to declare runtime dependencies and Python requirements for PEP 723?

Suppose this makes it into pyproject.toml and we come up with some meaning for a [run] table in projects that include one. Having both allows us to say “as a wheel, this project supports 3.8+, but when trying to run it from source, use 3.10+”.
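Purely as a hypothetical sketch (the names and exact semantics are made up, since what [run] would mean in pyproject.toml is exactly what hasn’t been defined yet), that could look something like:

[project]
name = "example-project"    # hypothetical
version = "1.0"
requires-python = ">=3.8"   # what the built wheel supports

[run]
requires-python = ">=3.10"  # what running straight from the source tree needs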
Do I know for sure that such usage is important? No – because we don’t know what the [run] table means alongside the [project] table yet, I can’t say which capabilities are or are not important.

Until we have a better handle on the meaning in pyproject.toml, we should keep in mind that a single project may have a reasonable use for multiple Python version constraints. So I don’t think we should combine the fields based on the supposition that a project only ever needs one version constraint.

2 Likes

I have mixed feelings about how this should color the standards process.

On the one hand, you have a very clear and singular vision for hatch and how hatch will support this. That’s awesome! Aside from the benefits for your users, it helps us to more easily grasp how things will impact hatch, etc. etc. It’s just great!

On the other hand, hatch isn’t the only implementation. pipx and pip-run probably won’t have automated means for manipulating this block. So don’t we risk sidelining other implementers when we talk about what hatch will do?


There’s a sub-topic here about how much the data should be human-writable vs machine-writable.
My own attitude is that I like formats which are both, so that people can choose toolchains which reflect their preferences.

2 Likes

True, though I was talking about not just Hatch but also Poetry, PDM, Rye, and similar tools that kinda provide the full experience.

2 Likes

I had always assumed that if the [run] table made it to pyproject.toml, it would be mutually exclusive with [project].

It’s not absurd to dissociate the two. For example, it could prevent confusion regarding why optional-dependencies doesn’t “work” in inline project metadata.

I figured this poll was primarily about aesthetics/UX and we’d leave the other discussion to other threads, but I guess not. :sweat_smile:

I voted 3 because it felt the simplest to me (“this is just a piece of a pyproject file”). 2 is nice but it feels unnecessary to add top level keys (and define interactions with old ones) when they already exist.

The cleaner look of 2 could be achieved by the definition of the 723 block: tools interpret the whole block as part of the [project] table. This constrains what it can do a little bit, but I always felt like this was a constrained use case anyway; more complex tools should move up to more complex config.
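To illustrate what I mean by “part of the [project] table” (a hypothetical reading, not something any current spec defines): a script block like

# /// pyproject
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///

would simply be read as if a pyproject.toml contained

[project]
requires-python = ">=3.12"
dependencies = ["requests"]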

1 Like

(Disclaimer: I’m not 100% sure if you mean “in pyproject.toml” or “in a script” - or both. I’m trying to answer below in a way that applies to both, so apologies if that means I’m a little generic in my comments).

To be clear, that’s absolutely not the goal in any use case that I have. I very definitely expect and want to be able to manually enter my dependencies. Tools like hatch that support a command-line way of saying “add this dependency” might be convenient[1], but I’d strongly object to any syntax which assumed that everyone would have access to them.

If Python ever stops being something that you can write using nothing more than a plain text editor, I feel that we’ll have lost something very fundamental.


  1. I say “might” because I don’t currently use such tools, so I don’t really know ↩︎

9 Likes

Hah! I think we are trying (and perhaps not always succeeding :wink: ) to focus on that, but without losing sight of the fact that “we love X in a script but have no idea what it means in pyproject.toml files!” is not a great outcome.

Part of the reason I like Option 2, of the choices given, is that I can define a meaning for it in pyproject.toml. I still don’t know what the other ones mean in a pyproject.toml file, other than “there’s a thing called running the project, which needs some dependencies”.

But what is “running the project”? I know what running a script is. Once the data is in pyproject.toml, it needs to have some meaning, doesn’t it? I think any proposal for a specific choice here needs to be paired with semantics for that choice when rendered in pyproject.toml.

All of this is why I’m going after this problem from the other end of the snake (python?) and trying to define data for pyproject.toml which I think PEP 723 will be able to pick up and use. It’s all waves hand in circular motion one big thing.

1 Like

Hm I guess I feel the same way about Option 3. requires-python and dependencies have a meaning in the context of building a wheel, and Option 3 says “these mean basically the same thing in the context of running a project script”.

As I hint at there, I do think it punts a little bit on “what does it mean to run a project”, and I also agree with @pf_moore and others that the answer there isn’t clear yet. I don’t think that answer needs to be clear to adopt Option 3 (or 2, for that matter), at least experimentally, and see how it goes.

1 Like

To add on a little bit to this: while using the existing keys from [project] doesn’t clarify what “running a project” is, it probably isn’t incompatible with how people are currently using pyproject.toml in those contexts. If you’re using the toml file for your config, even though you’re not publishing a wheel [1], I suspect you’re using requires-python and dependencies the way that they would be interpreted here. That’s how I do it, at least, but I don’t know all the other use-cases that well.
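For example, something like this (a hypothetical project that only exists on my machine) is how I end up using those keys:

[project]
name = "my-analysis"                       # hypothetical, never published
version = "0.1"
requires-python = ">=3.11"                 # the interpreter I actually run it with
dependencies = ["pandas", "matplotlib"]    # what has to be installed to run it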


  1. but maybe you pip install . locally, so you build one ↩︎

I very strongly believe that we simply shouldn’t be trying to decide what any of this means in a pyproject.toml file yet. The reality is that we just don’t know, and by pushing for a solution, all we’ll end up doing is defining something that’s not fully thought through.

We don’t even have a consensus that “running a project” makes any sense at all, in general - it clearly does in restricted cases like webapps, but if we simply define something that makes sense for one more type of project, we’ll just be kicking the can down the road for all the other cases.

I’d much rather we just worked out how to write script dependencies, and dropped the whole idea that “it must match pyproject.toml”. I really don’t think users are so fragile that they’ll panic just because the way things are defined for “scripts” and “projects” looks a bit different.

Maybe the example @ntessore came up with, where scripts that are part of a project might want to reference dependency groups from the project, might be worth looking at. But that can be a future extension to PEP 723, it doesn’t need to be defined right now. Let’s wait until PEP 723 and dependency groups are both accepted standards, and in real-world use, and then, if people express a need, we can design an extension.

Hmm, this makes me wonder. If we stick with the idea that whatever’s in PEP 723 must match what’s in pyproject.toml, what happens if we later add an extension that doesn’t make sense in a directory-based project? Do we reject the extension because it can’t fit the “everything must match” model? Do we contort the design to make it match?

4 Likes

It wouldn’t be a disaster if they didn’t match, but it’d be a slightly nicer experience if they did, in terms of learning/remembering the syntax in different places and moving code between the two systems.

So, either option 2 or 3 would make sense from this perspective, with the tweak of “don’t modify the pyproject.toml spec at all”. The syntax is the same in both scripts and projects, and the meaning would be essentially the same. But if you are writing a pyproject.toml then name and version are still required in the [project] table, even if you don’t build a wheel[1].

Maybe the nuance that @brettcannon is introducing in option 3 is that if we’re re-using the [project] table in scripts, then those keys become optional in the context of a script. But that doesn’t mean they become optional elsewhere.
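Concretely (hypothetical contents, and assuming whatever block spelling we end up with), a script could get away with as little as

# /// pyproject
# dependencies = ["requests"]
# ///

while a pyproject.toml carrying the same dependency would still need

[project]
name = "my-tool"    # hypothetical
version = "0.1"
dependencies = ["requests"]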


  1. if you’re not building a wheel then the whole table is optional, but pip install won’t work ↩︎

1 Like

I originally defaulted to option 1 but have updated my vote to option 2 if we take this approach of an implicit namespace.

1 Like

Part of the reason some people expect this to work similarly in a pyproject.toml might be because of the /// pyproject marker. To which a possible answer could be: stop naming it /// pyproject, use something like /// pyscript instead.

3 Likes

Honestly, I just wanted to see if there was any consensus from folks on a preference. I had not thought about making this about just PEP 723, since my hope has always been for there to be commonality between PEP 723 and pyproject.toml, but the way this discussion has gone suggests people are actually okay with having a PEP 723-specific solution. If that holds then we can view the poll as PEP 723-specific (I’ll update the opening post accordingly).

If we decide this is the way to go, then we could rename # /// pyproject to something like # /// run, representing an implicit [run] table if we ever put what’s in PEP 723 into pyproject.toml. And that could become what a hypothetical, future PEP proposes. But the name change would also help make clear that while we are adopting TOML as a common syntax, it is technically separate from pyproject.toml.

5 Likes

Option 2 is the simplest and keeps the door open for maximum flexibility over time.

I agree with @pitrou that when things get more complex it is time for the user to create a package.

3 Likes

I voted for 3 because I also thought this was mostly an aesthetic issue, and because 3 seems closer to what is already done in pyproject.toml. But also, having a redundant tag here like “[project]” or “[run]” (neither of which I find particularly clear or appealing) seems to me to give more flexibility for future changes (suppose someone later wants to add a different kind of metadata… :man_shrugging:)

+1 from me on

# /// run
# requires-python = ">=3.12"
# dependencies = ["requests", "textual>=0.44.1"]
# ///

As I said above, I would have been perfectly happy with this as a “compromise proposal” for PEPs 722 and 723. I assume that if we take this route, what happens in pyproject.toml would be discussed and agreed separately.

I know @ofek has said he has limited time - if this format is agreed as the final resolution for PEP 723, I’d be happy to help out with revising the PEP text, if needed.

17 Likes

I agree. Dependency specifications are about requirements, things that must be present for the Python code in the project to even successfully run. It’s totally sensible to additionally want to define “environments” (or whatever we want to call them) that impose additional constraints on package versions, but those are still layered on top of the base requirements to actually run the code.

Note that this applies equally well to any dependency, not just Python. If my code uses features of libwhatever that were introduced in version 2.8, then a version >=2.8 must be present to run my code at all. If I want to define some special environment that uses a higher version, that’s cool, but that doesn’t change the fact that the requirement is >=2.8. That is why it makes the most sense to me to have these requirements specified only once (i.e., not duplicated in [project] and [run]).

Incidentally, this yet again suggests to me that package management and environment management should be considered jointly. :slight_smile: That is, although package requirements and environment definitions are two different things, users will often want to derive environment definitions by combining the inherent requirements of a particular collection of code with additional constraints (like “I want to run this on readthedocs”). This will be easier if both requirements and environments are managed by the same tool. (Whether their metadata is in the same file is another question.)

I don’t really agree with that, and I see this as one of the deeper issues that makes packaging frustrating. The problem is that “creating a package” currently involves all this metadata and boilerplate and navigating a bunch of tooling, and even building and then installing a wheel. I think the experience for many users would be much nicer if there were a smoother workflow for “develop first, package later”, where it’s not necessary to engage the full packaging apparatus unless you actually want to publish a package. This would also ease some of the other use cases discussed in the “projects that don’t want to build a wheel” thread, like “send someone a zip file with your code”.

It’s possible that some of these needs would be met if the ecosystem became less fragmented, but I don’t see any signs of that happening with or without any of the proposals discussed on this thread.

My perspective on that is mixed. I agree that we may be rushing to propose something that’s not fully thought through, but personally I’d rather we just kick script dependencies down the road too until we figure out how it fits into these other use cases. I don’t think users are so fragile that they’ll panic just because script dependencies, which haven’t existed in Python for 20+ years, remain nonexistent for a while longer.

That’s a good point. I would hope we wouldn’t paint ourselves into a corner. But in my view that is what has already been done to some extent by creating a metadata format that uses generic words like “project” when what it really means is specific things like “wheel” and imposes specific constraints like the directory-based-ness that you mention. What I take from this current situation is that it is important to consider the widest possible range of use cases before baking anything into a fixed format.

Anyway, I voted for Option 2, because it seems most consistent with what I said above about not repeating dependencies, but all of these options maintain what I see as problematic aspects of the current metadata system (for instance, the need to treat Python specially rather than as just another dependency). I don’t see a huge difference between these options to be honest.

3 Likes

I suspect this reasoning inadvertently conflates “package” in the import sense with “package” in the PyPI distributable sense. It’s unfortunate IMO that the term is overloaded this way.

3 Likes

I’d rather we just kick script dependencies down the road too until we figure out how it fits into these other use cases. I don’t think users are so fragile that they’ll panic just because script dependencies, which haven’t existed in Python for 20+ years, remain nonexistent for a while longer.

Perhaps a subtle distinction to some, but it’s not that script dependencies don’t exist (they do and have for a very long time), nor that there aren’t already ways to encode them into scripts (lots of us have been doing that for ages too). What’s been requested is that the Python community standardize on a solution for that, so we don’t all keep doing it in a dozen different and mutually incompatible ways.

More tools are now adding support for script dependencies, and their authors and users would rather those new implementations tried to achieve mutual compatibility, but they’re looking to a standards body to hopefully tell them what “compatible” looks like. Otherwise they’ll just go ahead and add to the pile of extant random solutions, which you’re right isn’t substantially worse than the present situation, but it’s not great either.

2 Likes