Adding a non-metadata installer-only `dev-dependencies` table to pyproject.toml

This would be functionally similar to PDM’s development-only dependencies or Poetry’s dependency groups.

These are development-only “dependencies” of a package that aren’t supposed to be exposed to end users as extras (i.e. they don’t belong in project.optional-dependencies), while still allowing maintainers to install them during development.

Is there any appetite for consolidating this concept and adding a TOML table similar to optional-dependencies, except that it would only be available within development workflows and not exposed to end users?

[project]
...

[build-system]
...

[dev-dependencies]
test = ["pytest"]
doc = ["sphinx", "furo"]

What are the downsides to having these in optional-dependencies?


Sounds interesting, but I am trying to understand how it integrates with the existing workflow…

In the case of optional-dependencies, the build backend will translate the requirements to core metadata… How would dev-dependencies work?

Note also that poetry plans on deprecating dev-dependencies in favour of its more general dependency group mechanism, so standardising this might actually be a step back for poetry.

Edit: I misread, pradyunsg cleared things up below, my apologies for the confusion downthread.


That they’re exposed to end users, as dependency metadata. You can pip install package[tests], which… is dumb, and occludes the fact that pip install package[feature] is something conceptually very different.

They aren’t a part of metadata provided by build backends; they’re only for the project’s own development workflows.

What’s proposed here is equivalent to Poetry’s dependency groups model.


FWIW, one outcome from this could very much be that we all agree that it’s fine to use optional-dependencies for this; in which case, we should have installers (like pip) allow installing only those dependencies; and not the package itself.


In this case, how does it integrate into the workflow? Is this something pip will parse/handle itself?

This doesn’t affect build backends in any way (see again: “installer-only”, “aren’t a part of metadata”).

And, yes, installers such as pip would be expected to provide interfaces to install these requirements. The fact that Poetry, Hatch, PDM and others have implemented these mechanisms is (IMO) a clear indicator that this is a useful thing to have in user workflows.

A conceptual model of this would be… well, this is like having a docs/requirements.txt or a tests/requirements.txt; which describe completely different environments for performing completely different development tasks; some of which do not require installing the package itself.


I’ve gone ahead and updated the OP, to clarify some of the details and spell out what the links are to.

Some pip feature requests, demonstrating that there’s user appetite for something like this (I spent less than two minutes looking for these):

So I would suggest that the starting point is for Poetry, Hatch, PDM and the others to collaborate on a common interface/API, which could initially be located under [tool.shared]. That doesn’t require any sort of standardisation effort or consensus (beyond all of the tools agreeing). Once that’s established, it would be pretty straightforward to “promote” the data to a top-level standardised key, if people feel that’s worthwhile.
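Concretely, such a shared table might look something like this (a sketch only; the tool.shared location and the group names are hypothetical, nothing here is standardized):

```toml
# Hypothetical shared location agreed between the tools; not a standard.
[tool.shared.dev-dependencies]
test = ["pytest"]
doc = ["sphinx", "furo"]
```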

One thought though - handling a key like this is basically trivial in a standalone script:

import subprocess
import sys

import tomli  # "tomllib" in the stdlib on Python 3.11+

def install_deps(name):
    with open("pyproject.toml", "rb") as f:
        pyproject = tomli.load(f)
    reqs = pyproject["tool"]["shared"]["dev-dependencies"].get(name)
    if reqs is None:
        raise ValueError(f"Invalid dev dependency name {name!r}")
    cmd = [sys.executable, "-m", "pip", "install", *reqs]
    subprocess.run(cmd, check=True)

Given this, I’d prefer to leave the handling of this feature to “workflow tools”. I don’t think it’s in scope for installers in general (and pip in particular). If someone published the above script for developers who don’t use a high level workflow tool, that would be fine, and would avoid the need for pip to support this.

At some point, I do think that pip needs to make a firm decision on whether it’s a development workflow tool or just an installer[1] - we’re currently in an uncomfortable position of having the same scope debate every time discussions about supporting development workflows come up. But that’s a separate question, and is something the pip devs can discuss privately.


  1. And as a consequence, is competing with hatch, PDM, etc, or is providing low-level support for them. ↩︎


It depends on how you view extras. I personally have never seen them as purely a way to enable functionality; that is just one use of them. Paul is right, this doesn’t just impact installers but also any workflow orchestration tool that inspects extras (tox, poetry, pdm, etc.). I am personally -1 on adding a new variant of extras.


I absolutely love that PDM supports this feature already, as “dev groups”. And before PDM, I was using multiple txt files in the requirements folder, one for each group. The groups are tests, tox, pre-commit, typing, docs, and dev. None of these should be exposed as pip-installable to end-users of Flask, they’re only relevant to contributors.


Meaning they would not land in the metadata inside the distributions (sdist and wheel)? Would there be more to it?

If something is done regarding dev-time dependencies, I would expect a few more features, so that a much more solid interface towards CI/CD systems, IDEs, tools like tox, and so on can be achieved. Otherwise it feels like the whole thing could be solved by just agreeing that, for example, all extras with the prefix dev- are not added to the distribution metadata.
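That prefix convention is easy to picture; here is a purely illustrative sketch (the dev- prefix rule and the public_extras helper are hypothetical, not part of any backend):

```python
# Hypothetical sketch: a build backend that agreed on a "dev-" convention
# could drop those extras when writing core metadata. No backend does this today.

def public_extras(optional_dependencies):
    """Return only the extras that would land in distribution metadata."""
    return {
        name: reqs
        for name, reqs in optional_dependencies.items()
        if not name.startswith("dev-")
    }

extras = {
    "cli": ["click"],
    "dev-test": ["pytest"],
    "dev-docs": ["sphinx", "furo"],
}
print(public_extras(extras))  # prints {'cli': ['click']}
```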


Suggestions off the top of my mind:

  • Some kind of marker so that the IDE (or whatever other 3rd party tool) knows if the group of dependencies in question should be installed within the same virtual environment as the project or not (static linters, code formatters vs. test runners for example).
  • Some reserved names with specific meaning (test, lint, format… or a whole prefix x-), so that tools (IDEs) know what the purpose of this group is, and possibly handle it in a particular way.
  • A different notation that would prevent repetitions, for example 'libA>1.2.3' in ['test','lint'] instead of 'test' contains 'libA>1.2.3'; 'lint' contains 'libA>1.2.3'.
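For example, the inverted notation from the last bullet could look like this (hypothetical syntax; the extra requirements shown are made up for illustration):

```toml
# Group-first, as in the proposal at the top of the thread:
[dev-dependencies]
test = ["libA>1.2.3", "pytest"]
lint = ["libA>1.2.3", "ruff"]

# Requirement-first, the hypothetical inverted notation (no repetition):
# [dev-dependencies]
# "libA>1.2.3" = ["test", "lint"]
# "pytest" = ["test"]
# "ruff" = ["lint"]
```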

One thing I “lost” moving from requirements files to PDM’s dev groups is that GitHub doesn’t list my project’s many dev dependencies in its dependency graph anymore. If we add dev groups as standard pyproject.toml metadata, that would give GitHub a way to support those in their dependency graph.


Ditto for Hatch. As I’ve expressed before, these non-runtime extras are attempts at describing separate environments without an actual environment manager.


This fundamentally is a duplicate of Providing a way to specify how to run tests (and docs?)

My response therein: Providing a way to specify how to run tests (and docs?) - #49 by ofek

I don’t think it is, or at least what’s being discussed here was just part of that other topic.

As an example, let’s take packaging (because I bet Expose test dependencies outside of `noxfile.py` · Issue #601 · pypa/packaging · GitHub sparked this idea for Pradyun :grin:). It uses nox, so it has its requirements to run its tests embedded in its noxfile.py. Having those dependencies reachable by nox is important for running the test suite against multiple versions of Python.

The problem is that I also want to be able to run the tests while I’m doing development in VS Code using its test explorer (it’s way faster to run a single test under my virtual environment when trying to get that one test to pass). Right now, though, I have to manually extract the requirements strings from noxfile.py and run the pip install command manually in order to get the appropriate projects installed to run the tests (on top of doing an editable install thanks to the src layout, and if this was some other project probably to install normal dependencies). That is not a great development workflow to ask of contributors to packaging to go through if they want their editors or other tooling to somehow participate in the development process.

The options I’m seeing listed here are:

  1. Standardize in a new way, although it sounds like Hatch and Poetry don’t love that.
  2. Leave it to the tools, which may or may not work depending on whether tools want to support this concept.
  3. Use extras as-is, which some people don’t like as they view extras as part of the API of their package in a way.
  4. Use requirements files, which isn’t a standard and leans into pip pretty hard for this.

But I don’t see this as something we can just leave up to task runners as that cuts out a legitimate workflow IMO.


I’m surprised you’re advocating for this instead of your other idea: Support a way for other tools to assist in environment/interpreter discovery · Discussion #168 · brettcannon/python-launcher · GitHub

If that’s the only concern for extras, can we hide some extras under some dev flag? I don’t personally mind having these in extras.

I get why @pradyunsg thinks using optional-dependencies for this is “dumb” – it feels weird that a user could install tests as an “extra” – but my impression from the maybe-not-a-duplicate topic was that a lot of people maintaining downstream package managers actually use this to run tests (edit: without building and inspecting an sdist)?

Like @CAM-Gerlach was saying here:

Would they lose this ability if there were a dev-dependencies table, then?
Sorry if this is just revealing my ignorance about what one can get out of a distribution package.