PEP 735: Dependency Groups in pyproject.toml

Yes, I suggested it way back in a separate topic that Stephen worked off of.

I’d be confused if types was an extra that was not, in fact, “extra” (i.e. it was always required).

1 Like

Oh, right :D

So if we were to pursue an idea like this:

[project]
dependencies = ["httpx[brotli]", ".[types]" ]

then obviously brotli would be an extra and types would be a group.

I would absolutely not interpret types as a group in this case, because this uses the syntax for extras, and groups aren’t meant to be exposed through the extras public API. Moreover, what if there’s an extra called types and a group called types? .[types] means “all the default/unconditional runtime dependencies plus everything in the extra”. Dependency groups have different semantics, since they aren’t supposed to pull in the base package deps like that.
All this is to say that there’s nothing obvious about the perceived relation of types to dependency groups in this example. If anything, types is obviously an extra, especially since it’s a PEP 508-compliant string that is defined as such. I’m sure most people are accustomed to recognizing this as an extra in various interfaces.
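To make the name collision concrete, here is a hedged sketch (the package and dependency names are made up; [dependency-groups] is the table name proposed by the PEP) of a project defining both an extra and a group called types:

```toml
[project]
name = "mypkg"

[project.optional-dependencies]
# extra: installing "mypkg[types]" pulls in mypkg's base deps as well
types = ["mypy"]

[dependency-groups]
# group: installable on its own, without the base package deps
types = ["mypy", "types-requests"]
```

A string like .[types] in a PEP 508 context can only refer to the extra, which is the ambiguity being described.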

Although we did entertain the idea earlier in this thread, I’ve come to be against a string syntax for includes.

The main issue is that it must be mutually exclusive not only with PEP 508 but also with any future string syntax for dependencies.

I want to work on path dependencies in the future – IMO it’s an important topic to explore – and I don’t want us to be forbidden from using .[foo] to mean “current directory with the foo extra”.[1]

I also think that .[foo] looking like an extra is a downside, rather than an upside, since you could have an extra with the same name as a dependency group but slightly different contents.


Is there some significant upside to a string syntax that I’m missing?


  1. no commitment to that being a valid path dependency in the future proposal, of course ↩︎

2 Likes

I don’t think so.

Hi @sirosen, it seems like you are pretty determined to push for this. I’ve expressed some concerns higher up which I won’t repeat here. I’ll just ask if you can please clearly include this topic in the backwards compatibility section of the PEP.

One thing I will point out is that this seems quite incomplete to me. You seem to assume that only build backends use this data, and once they do the work to add support for the new feature and cut a release, then the problem is solved. However, that is not the case. This metadata can be read by a host of tools when it is static. Static metadata is preferred, as expressed in the motivation section for PEP 621:

“To speak specifically to the motivation for static metadata, that has been an overall goal of the packaging ecosystem for some time. As such, making it easy to specify metadata statically is important. […]”

It can get used not only by build backends/frontends, but by tools like dependency analyzers (SBOM tooling, GitHub’s dependency list, Tidelift, libraries.io, etc.), by distro tooling like Grayskull/pyp2spec (see PEP 725 for more detail on that), as well as by custom dev/analysis scripts and tools. When a package starts using {include = ...}, that is all likely to break.

To give one concrete example of that, look at Dependencies · pypa/build · GitHub. GitHub extracts the dependencies from pyproject.toml, as noted by text below each dependency like “Detected automatically on Jan 17, 2024 (pip) · pyproject.toml”. On that same page, it has an “Export SBOM” button. It’s probably not the highest-quality SBOM generator, but anyone who uses it may see dependencies silently go missing, until GitHub adapts to the change (and it took years for them to start supporting the current format, so that adaptation may take a while).

Final thought: you also expressed that you want dependency groups to be extensible in the future. Can you please think about how to obtain that extensibility without the risk of breaking users of dependencies for a second time if such an extension does occur?

4 Likes

Sorry, my fault, it was a bad example. The point was not about using square brackets specifically, but about using a string syntax. It could just as well be something like dependencies = ["httpx[brotli]", "{types&stuff}" ] or whatever.

It might make it easier to use on the command line and in all sorts of other places where it might not be possible to use TOML syntax. There would be one syntax for a whole bunch of contexts.

For example, my hope would be that I could write something like the following with any installer and it would have good chances of working without needing to learn the specific option flag used by that specific installer:

${INSTALLER} install 'httpx[brotli]' '.{types&stuff}'
1 Like

I don’t think this needs to be part of the PEP, though. The PEP specifies how this fits into pyproject.toml, it doesn’t need to constrain anyone’s interface. A future PEP[1] could standardize that if desired.

That is to say: it’s possible for ${INSTALLER} to accept .{types&stuff} as the way to specify dependency groups even though that isn’t valid inside the TOML file itself. If the community standardized on that notation, another PEP could update the specification.


  1. or a PyPA decree, or something ↩︎

2 Likes

The point is that such a notation would be usable (and used) in as many contexts as possible to reduce the risk of having to learn multiple notations. So if right from the introduction of the very concept, the notation is not used, then the goal is completely missed.

1 Like

I suppose my perspective is that agreeing on the proper notation for this seems difficult, and I don’t want that to block or delay the meat of the PEP, which is more important (IMO). Defining that shorthand is not necessary for this to work.

4 Likes

I read through much of the thread and the PEP itself, so apologies if I missed it, but is there a clear write-up anywhere on why dependency groups are necessary in lieu of a single section for development dependencies? I get that Poetry and PDM support these more granular groups, but that just leads me to ask why it’s necessary in those tools. Is it just about performance, and avoiding installing more dependencies than are necessary?

To clarify the motivation behind my question: with all respect to the work that’s gone into the PEP thus far, I worry that the desire to make the concept general ends up shifting a lot of complexity onto the end-user.

Imagine it from the perspective of a non-packaging expert: we’ll now have dependency-groups alongside dependencies, but they’re totally different things; my development dependencies are supposed to go under dependency-groups but my non-development dependencies are not. And dependency groups aren’t installed with the package, but are also different from optional dependencies / extras.

Personally, I would put a significant premium on being able to use “development dependencies” as the core concept, the terminology, and so on (imagine, e.g., project.dev.dependencies and project.dev.optional-dependencies for extras). It would be intuitive for users and maps precisely onto the way that Node, Rust, etc. manage development dependencies. So if the PEP is not going down that route, I’d love to see strong motivation for it.
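For illustration only, the layout this commenter is imagining (their hypothetical, not anything proposed in the PEP) would look something like:

```toml
[project.dev]
dependencies = ["pytest", "tox"]

[project.dev.optional-dependencies]
docs = ["sphinx"]
```

The contents here are made up; the point is only that "dev" would be the organizing concept, mirroring devDependencies in Node or dev-dependencies in Rust.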

2 Likes

Here’s why and how I use separate environments in Flask and other Pallets projects. We use pip-compile to pin multiple environments: flask/requirements at main · pallets/flask · GitHub. We use pip-compile and not PDM because dependabot cannot update PDM lock files, and not Hatch because hatch doesn’t have lock files. On a side note, really looking forward to a lock file standard that tools and dependabot all work with.

In CI, and when running envs locally with tox, the goal of having different environment pin files was saving time by installing only the exact libraries that environment needs and nothing else. Is this actually useful? I don’t know anymore. It probably was when I first started, when the resolver in pip was slower and caching pip with the setup-python action didn’t exist yet. Nowadays, maybe I could just use a single environment. I also haven’t tried switching to uv yet to see how that would affect things. This is mostly a case of the momentum of a working solution, and not having enough time to experiment on top of all the other stuff I have to do.

We also have a build.txt pin file that pins only the build dependency for our build and publish to PyPI workflow. In this case, installing anything else besides build is overhead. Again, does this matter? Not sure.

We also have a dev.txt that -r depends on (most of) the other pinned environments and adds pre-commit and tox, so contributors can run all the tools locally. (This could potentially cause issues if two environments managed to solve and pin different versions of the same library, since they would conflict when combined in dev.txt, but in practice this has never happened.)
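A hedged sketch of that layout (the file names here are illustrative; the actual Pallets pin files differ):

```
# requirements/dev.txt
-r docs.txt
-r tests.txt
-r typing.txt
pre-commit
tox
```

pip's `-r` directive inlines each referenced file, which is how one dev environment can aggregate the other pinned environments.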

Despite this somewhat complex setup, our instructions to contributors end up being simple, equivalent to “one dev dependencies group”:

$ python3 -m venv .venv
$ . .venv/bin/activate
$ pip install -r requirements/dev.txt && pip install -e .

One other thing for us to consider (not sure if it’s actually an issue with this proposal/discussion) is that the pip install -e . after the dev dependencies is important. We develop Click and Jinja, which are used by our own development tools, so order matters; otherwise the local version ends up overwritten by the pinned dev dependency.

Some of our libraries also have minimum required dependencies (abc>=x.y) to test against, or test against dev versions. In this case, we also want to be able to pin for those specific envs: flask/requirements-skip at main · pallets/flask · GitHub and ignore them for dependabot.


The above is for the case of library development, but I use the exact same setup developing applications for work as well, except with an extra base.txt environment from pip-compile pyproject.toml.

3 Likes

That’s why I use the feature with poetry, yes. I have, for example, a “sphinx” group which installs only packages needed to build the documentation, and this group is used in a GitHub Action to generate the docs website.
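In Poetry, such a group is declared under [tool.poetry.group.<name>.dependencies]; a hedged sketch (the group name and version constraint are illustrative):

```toml
[tool.poetry.group.sphinx]
optional = true

[tool.poetry.group.sphinx.dependencies]
sphinx = "^7.0"
```

A docs workflow can then run something like `poetry install --only sphinx` to install just that group.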

4 Likes

I don’t use this sort of group in existing tools (because I don’t use those tools :wink:) but for me, it’s about organisation. Partly that’s “not installing more than necessary”, and partly it’s “naming things to make them easier to remember/reason about”. So I want doc dependencies so that I know what’s needed to make the docs, test dependencies so I know what I need to run the tests, etc. Having just one big bunch of “dev dependencies” means that I have to keep that information some other way.

Maybe I’m unusual, but I don’t like the idea of having a semi-permanent “dev environment” for my project, which I need to maintain alongside the project. Instead, I prefer to use short-term, throwaway environments dedicated to a particular purpose. So for standard tasks I’ll use nox (where the “build docs” task will install the “docs” dependency group, etc). I will typically have a project .venv, but it’s a working area, which would normally just have my project installed (and maybe libraries I’ve been experimenting with as part of development). I would absolutely not rely on being able to run pytest or sphinx in my project environment.

So for me, I’d never use a “dev dependencies” group that was intended to have everything in it. But I would use more granular dependency groups, one per task, for organisational purposes. Yes, speed is relevant (if I’m using throwaway environments, rebuilding them fast is important) but that’s secondary.
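Under the PEP's proposed table, that per-task organisation would look something like the following sketch (group names and contents are illustrative):

```toml
[dependency-groups]
docs = ["sphinx", "furo"]
test = ["pytest", "pytest-cov"]
lint = ["ruff"]
```

A task runner could then install exactly one group per throwaway environment, rather than one catch-all "dev" set.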

4 Likes

Hey all!

Personally, I currently use poetry, and the dev dependencies are helpful to me. I definitely see the use cases for multiple dev environments as well. However, I would agree with Charlie: the naming is confusing. dependency-groups makes sense if you read the PEP, but not if you’re just coming into Python and exploring these files. My first impression would be that dependency groups are for extras. I think that dev-dependency-groups would be much clearer, though there are probably more ways to workshop it.

1 Like

Whereas I think dev-dependency-groups is just too prescriptive about what these can be used for–they don’t need to be dev-related. They can be used for extras (or just to organize the default dependencies, for that matter). The generic name dependency-groups seems both accurate and precise.

4 Likes

The answer is probably in the issues (feature requests) of their respective GitHub projects. I could find this one, which also mentions a similar feature in the Ruby ecosystem:

1 Like

To clarify my own understanding: I don’t see this in the current version of the PEP, but there’s a lot of discussion in the thread around it being included. Is it still the intent to include this in PEP 735? (Or am I misreading the PEP?)