I think this was already discussed at least a couple of times on this forum. The latest attempt is probably this one: Adding a non-metadata installer-only `dev-dependencies` table to pyproject.toml
It seems like that discussion has gone pretty stale, but I feel like this is something that should really be added. I’d be willing to write a PEP if that is what is needed for this to be pushed into being a real feature.
Thanks for working on this. It’s really needed indeed.
Most projects will have more than just dependencies and dev-dependencies. There will be different sets of dependencies for different dev tasks, e.g. building the docs vs. running the tests. In fact you might want to run tests with different sets of dependencies installed, to check that they pass both with and without optional dependencies being installed. This is why workflow tools like Poetry and PDM allow specifying named dependency groups rather than just having two sets of dependencies.
I guess if you manage to write an objective summary of the discussion so far, that would be a great way to resume the discussion and that would probably keep the level of motivation required to get back on the topic as low as possible for others. People do not necessarily want to go back and read the whole (long) discussion (again). I know I don’t want to and I would prefer a summary.
Reading only the bottom of that previous discussion, one can notice there seemed to have been some kind of relative agreement on this suggestion:
So if you want to write something (a PEP?) maybe that is a good place to start. And I guess you would need to have some kind of agreement / buy-in (or even better some proof of concept implementations) from at least one or two tools (I guess dev workflow tools such as Hatch, PDM, and Poetry probably make the most sense, but also tox, nox, and “task runners” such as doit, invoke, and taskipy could be very good candidates as well).
It could be made either a list of dependencies or a mapping like the one `optional-dependencies` accepts, so either of these could be accepted:
```toml
[project]
dev-dependencies = ["foo", "bar"]
```
```toml
[project.dev-dependencies]
docs = ["foo", "bar"]
tests = ["bar", "baz"]
```
Supporting both, I feel, would be the best solution, allowing both simple and advanced uses of dev dependencies.
I think the reason this keeps getting bogged down without coming to any conclusions is that it’s not at all clear that standards are the best way of defining the user interface of individual tools. We’ve had that debate a number of times, and never really come to a good conclusion.
I don’t have a really good solution here either. My standard response is to suggest that all of the interested tools define their own entries - `tool.poetry.dev-dependencies`, and so on - and work on defining standard semantics and behaviour. Then users only have to change the tool name to move their definitions to another tool. And once we have a majority of tools all providing the same UI, it would be relatively easy to propose that the shared behaviour gets “promoted” to a standard, and given a top-level name in the namespace.
While that’s at least mildly annoying for the user, it does at least expose clearly that the main issue here is getting the tools to agree on a behaviour. If tools can’t come up with a behaviour they all agree on, it honestly doesn’t matter whether it’s called `tool.X.dev-dependencies` or top-level `dev-dependencies` - it won’t be something users can rely on to behave the same whatever tool they use. Which IMO would be more annoying for the user.
I’d suggest that people interested in moving this forward focus less on how the data gets stored in `pyproject.toml`, and more on how it works, and getting support in the various workflow tools. That seems to me like a more tractable problem to solve.
And just to state my own position, I can see the benefits of having one or more named sets of requirements (I’m not convinced that a single “dev” is enough), but it’s not something that matters enough to me to motivate me to work on it. For my own projects, I’m happy right now with just using requirements files for the occasional times I need something like this.
> focus less on how the data gets stored in `pyproject.toml`, and more on how it works, and getting support in the various workflow tools
+1. On behaviour, the named workflow tools all create virtual environments. The requirements-file workflow leaves it up to you whether to install into your current environment or to make your own virtual environment. Here it’s either the tool or the user keeping the metadata of “where is the virtual environment” somewhere. This naturally leads me to suggest having the workflow tools agree on how users would tell the tool how they want their virtual environments built.
To me, having just “dev” or any finite enumeration would not be enough. Dependency specification alone is not enough either. And I anticipate others arguing that even virtual environment specification isn’t enough: what about scripts?
On workflow tools using a virtual environment specification in `pyproject.toml`, one awesome suggestion (by @pradyunsg?) at PyCon US is, instead of defining a standard, to have the workflow tools bootstrap on a common tool: say `[tool.venv.dev]`, then possibly `[hatch.envs.dev.from_virtual_environment] = "tool.venv.dev"` (again, the exact layout in `pyproject.toml` is not important).
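To make the bootstrap idea slightly more concrete, a purely hypothetical layout might look like the following (every name here is invented for illustration; the discussion explicitly left the layout open):

```toml
# Hypothetical common section that workflow tools could bootstrap from.
[tool.venv.dev]
dependencies = ["pytest", "mypy"]

# A tool-specific environment that points back at the common section.
[tool.hatch.envs.dev]
from_virtual_environment = "tool.venv.dev"
```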
What exactly do you mean by “behaviour” here?
To me the specification of a group of dependencies needed for some particular task/environment is something that can be abstracted from exactly what the tools do with that information. Ideally I would be able to access that same list of dependencies from different tools to do different things. Hypothetically maybe vscode has one way of selecting a group of dependencies to create/activate a desired environment and maybe nox has another. I might be using one environment management tool locally but then something else in CI. I might also have e.g. dependabot tracking any pinned versions or I might be using something analogous to pip-upgrader for that. These tools all do different things so I don’t expect the same behaviour from them. It would be nice if they all knew how to read/write the dependency groups as needed though and if that information did not need to be duplicated for each tool.
These dependency groups are properties of my project and not some tool just because I might happen to use the tool for some things. Why does one tool get to claim ownership of my dependency groups?!
So what you want is simply to be able to specify a mapping from names to lists of requirements? I guess that makes sense, but what’s the use case? More precisely, what’s the use case for putting this information in `pyproject.toml`?
The point of putting metadata in `pyproject.toml` in PEP 621 was so that we’d have a backend-independent way to specify the data, knowing that all backends would treat that data in the same way. What I’m trying to say is that we don’t yet have a clear understanding of what it would mean for “all tools treat the data in the same way” in the context of named lists of dependencies, so the argument for putting that data into `pyproject.toml` isn’t as strong.
To be honest, I don’t see the point of putting this data in `pyproject.toml` unless there’s an expectation that some tool will do something with it. So I’m struggling to understand why people are so uncomfortable with the idea of writing down what they expect a name → requirements mapping to be used for. Unless it’s because there actually isn’t as much consensus as people think there is…
One example I can think of that could profit from standardization around groups of (development) dependencies:
How do I tell tox to install Poetry’s dev-dependencies in the test environments?
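There is no direct way today. One workaround people use (a sketch, assuming Poetry 1.2+ dependency groups and a recent tox) is to let tox shell out to Poetry instead of listing the dependencies itself:

```ini
[testenv]
# tox cannot read Poetry's dev-dependencies, so delegate the install
# step to Poetry itself (it must be available outside the tox env).
allowlist_externals = poetry
commands_pre = poetry install --no-root --only dev
commands = pytest
```

A standard dependency-group section would let tox read the groups directly instead of delegating like this.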
> I guess that makes sense, but what’s the use case? More precisely, what’s the use case for putting this information in `pyproject.toml`?
The information is already being put into `pyproject.toml`, but every tool has to define its own way of putting it there. Instead, we could have a standard place and format for this information, so that it can be understood regardless of the specific tool being used.
From my perspective, the only thing that really matters is to actually have a way to define them. I would not try to aim for much more than what current regular dependencies offer. There is always a way to make this more complex later on.
Poetry, Hatch, PDM, and now Rye all have their own way to define this, and it’s getting weird.
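For illustration, the tool-specific spellings look roughly like this (approximate, as of the time of this discussion; check each tool’s docs for the exact schema):

```toml
[tool.poetry.group.dev.dependencies]  # Poetry 1.2+
pytest = "^7.0"

[tool.pdm.dev-dependencies]           # PDM
test = ["pytest"]

[tool.rye]                            # Rye
dev-dependencies = ["pytest"]

[tool.hatch.envs.default]             # Hatch environments
dependencies = ["pytest"]
```

Four tools, four incompatible layouts, all expressing essentially the same name → requirements mapping.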
OK, I guess I have no objections to this idea then. I’m not sure how it gets taken forward - there’s been a lot of discussion in the various threads on this topic, but no particular conclusion or consensus. Maybe someone needs to just write a PEP and solicit comments/support from interested parties (in particular tool vendors with an existing equivalent, and installers like pip).
I don’t have anything else to add, really.
Tools can always feel free to get together and decide on sharing a section without involving a standard - `[tool.dev.dependencies]`, for example.
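For example, a shared section along those lines (hypothetical; nothing reserves this name today) could be as simple as:

```toml
[tool.dev.dependencies]
docs = ["sphinx"]
tests = ["pytest"]
```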
It would be nice to see this happen but to me it does not seem to be the way that things typically progress. I don’t see much collaboration on interoperability that is not directly related to standards like PEP 517.
Push those tool authors to do it!
Why push the tool authors to do this and not push the official spec itself? Having it as part of the official spec is 100x better, as it allows for one proper way to do it rather than tools needing to support one official spec plus three other 3rd-party ways. This is why the pyproject format exists: to combine the existing formats into one coherent format.
I like the enthusiasm!
Because the official spec is useless if the tool authors don’t follow it. It’s much easier to add an official spec for something that already de facto happens because the tools already support it. Ideally the standards would follow established practice in the first instance rather than drive it. Of course often that approach doesn’t work…