Sorry, I didn’t know what you were referring to “up-thread” (I’m still not sure) so I assumed you were referring to your own proto-PEPs.
I think all these different proposals need to be considered in isolation–bringing up arguments from other proposals muddies the waters and it’s why previous threads became too complex for people to follow.
I for one would be very happy to see the “[project]” table expanded/modified as @brettcannon, @jamestwebber, and @kknechtel are describing.
Hi and welcome! And thanks for mentioning this - I’ve found it to be a consistent pain point when working on isolated / air-gapped networks (that are usually very unlikely to have a local package repo.)
I think I haven’t expressed it in this thread yet so I will restate my position a bit stronger this time: I am a hard -1 on this, Hatch will never support this, and I will continue to actively discourage embedding lock files in pyproject.toml files anywhere that I can in writing. Proponents of this don’t realize how few people want that file to be hundreds of lines long.
As for the rest of the proposal, I don’t yet have a comment because I haven’t finished reading this thread. As always, I appreciate the effort here!
It’s possible to specify a URI for the dependency, and URIs here are defined by RFC 3986. So as far as I can tell, it should be possible to do something like "my_requirement @ file:///path/to/my_project/dist/my_requirement-1.0.0-py3-none-any.whl".
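For illustration, here is roughly how such a direct reference could look in a pyproject.toml (the project name and path are made up):

```toml
[project]
name = "my-project"
version = "1.0.0"
dependencies = [
    # PEP 508 direct reference via an absolute file:// URI
    # (illustrative path only; relative paths are not valid here)
    "my_requirement @ file:///path/to/my_project/dist/my_requirement-1.0.0-py3-none-any.whl",
]
```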
What Matt might be mentioning is local editable installations. That is indeed not covered and is a highly requested feature. I will be implementing that in Hatch in the spring which will basically copy Cargo workspaces.
That’s a good point; I got a little bit carried away suggesting lockfile data being stored here. When you count multiple artifacts for different interpreters and platforms, it balloons very quickly into something huge.
Let’s just consider that a think-o by me and spill no extra ink over it here.
This thread is covering a decent amount of ground in terms of alternatives to the proposal. I’m happy to try to write alternative PEPs, but I’m pretty sure I’m not qualified to do so on some topics – e.g. loosening the definition and restrictions on required metadata fields seems like a very different kind of spec.
Based on current feedback, I plan to take some time tomorrow to remove the object-specification for dependencies (at least for now; it could always be reintroduced later) and the relevant compatibility requirements. The feedback on that aspect of the proposal is trending negative, and it’s relatively simple to just remove it and put it in the rejected section.
This recent point about local path deps doesn’t feel fully addressed to me. IIRC, file URIs can’t express relative paths. Since the proposal is already defining a bit of new syntax, perhaps this case could be included as well.
If you do that, you have two conflicting requirements - the need for “more capable” dependency specifiers (such as relative local file references) versus the fact that dependencies in extras must be limited to PEP 508, by design. That’s a very subtle distinction (as evidenced by the number of people who ask why they can’t reference a sibling project in an extra or a dependency) so it’s going to be very difficult to explain to users.
My gut instinct here is to explicitly leave package dependencies and extras alone, unless you want to get sucked into the complexities involved in the reasons behind PEP 508’s limitations (which to be honest, I’ve forgotten, but I know they were important).
These features[1] are definitely not supported by PEP 508 specifiers, but are supported by requirement files.
It’s worth being clear here - the pip-specific features of requirement files are not just the options. Pip allows more than just PEP 508 specifiers as arguments to a pip install command (and therefore in requirement files). Limiting things to PEP 508 only is an extremely restrictive subset of what requirement files can do.
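To make that concrete, here is a small, purely illustrative requirements file using a few of the pip-specific features that have no PEP 508 equivalent (the paths and URL are made up):

```text
# per-file options are allowed alongside requirements
--extra-index-url https://pkg.example.invalid/simple/

# a plain PEP 508 specifier
requests>=2.28

# a relative local directory (not expressible in PEP 508)
./libs/internal-helper

# an editable install of a local project
-e ./libs/other-helper

# include another requirements file
-r requirements-test.txt
```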
I take your comment as meaning that in PDM’s experience, a feature that’s limited to only PEP 508 would be insufficient to cover the use cases you see. Thanks - that’s very valuable feedback.
Please think about “How to teach this” if you do that. We currently have PEP 508 (a standard), and requirement files (“anything that can be passed to pip install”). You’re now adding a third concept to the mix, with its own peculiarities and limitations.
What gives you that impression? I’ve said this a couple of times already, but under the current proposal for --only-deps, it will fall back to calling the build backend if the dependencies can’t be determined statically. So it absolutely assumes that the project can be built into a wheel.
You still have the problem that project.dependencies and project.optional-dependencies can only hold PEP 508 requirements. So they won’t be sufficient for PDM’s use cases, for example. And before you say “let the consumer choose what is allowed”, at what point have we loosened the spec so much that it’s useless? We still need to document how a user writes a pyproject.toml file, and if we’re repeatedly saying “it depends on your tool”, we’re simply worsening the problem of “no clear guidelines” that users are (rightly) complaining about.
Just to provide another user story: At work, we use a template for Python projects to provide a consistent developer experience across projects. Many of our developers do not mainly program in Python and don’t care about which of the many tools in the Python ecosystem we use. We therefore abuse extras to provide a workflow where we can tell people “If you change something, run tox run -e lint/test/docs/build and make sure that the output is green”, while still maintaining all dependencies in one place. Setting up a development environment usually just means running pip install -e .[dev] (dev contains lint/test/…). This leads to a very nice developer experience, but it requires installing the package with all its runtime dependencies even when, for example, all we actually need is the build package to build a wheel.
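Roughly what that looks like in our pyproject.toml (simplified, with illustrative tool names; the self-referential dev extra needs a reasonably recent pip):

```toml
[project]
name = "my-package"
version = "0.1.0"
dependencies = ["requests"]

[project.optional-dependencies]
lint = ["ruff"]
test = ["pytest", "pytest-cov"]
docs = ["sphinx"]
build = ["build"]
# "dev" aggregates the other groups via self-referential extras
dev = ["my-package[lint,test,docs,build]"]
```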
This PEP looks very promising to me but I am not clear on a few things.
From my understanding, three types of dependencies have to be considered:

- runtime dependencies (project.dependencies)
- optional runtime dependencies, i.e. extras (project.optional-dependencies)
- dependencies that are only needed for development tasks (tests, docs, etc.) and not at runtime
Why should the last type of dependencies be specified in requirements.packages instead of project.non-runtime-dependencies (actual name tbd)? What’s the difference between requirements and dependencies? I’d find it very unnatural to explain to someone why these things have to be treated so differently.
I’m not sure if it is the right time to start specifying the object definition, like {spec = "numpy>1"}, as it is not yet clear what the required fields are. The PEP could say that requirement specifiers can only be strings for now, but may be extended to objects in the future. Tools that don’t support the object notation should warn, but must not error out.
The object notation is required in case requirements.packages replaces project.dependencies and project.optional-dependencies in the future, right? That seems like a massive migration effort and using something like project.non-runtime-dependencies could prevent the need entirely. However, this assumes that the project table can be used for things that are not strictly related to building distributions, as discussed in Projects that aren’t meant to generate a wheel and pyproject.toml.
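For reference, the two notations under discussion might look like this (the table name follows the current draft, and the object form’s field names are exactly the part that is still unsettled):

```toml
[requirements]
packages = [
    # plain string: an ordinary PEP 508 specifier
    "numpy>1",
    # object notation: could grow extra fields later,
    # but the required keys are still TBD
    { spec = "numpy>1" },
]
```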
Keep in mind that the reason relative paths aren’t supported by the PEP 508 dependency spec is that some tools copy the project directory elsewhere before building. I think pip used to do that for a while.
Actually, it’s not just concepts that are proliferating here. Consider a new user[1], who is writing a Python project. They want to list out the dependencies for their test suite and documentation build, so they can automate them using something like tox or nox. This project may or may not be building a wheel. The user currently has two options that I can think of:
1. Declare the dependencies in extras (named test and doc, for example).
2. Put the dependencies in requirement files (requirements-test.txt and requirements-doc.txt).
Option (1) is awkward in cases where the user doesn’t want to install the project itself. There’s a pip install --only-deps option being proposed to fix that, but it still expects the project to be structured so it can be built into a wheel. Option (2) is pip-specific, and doesn’t play well with tools like PDM, Poetry, or Hatch.
We’re now proposing a third option, dependency groups.
That’s an awful lot of choices for a user who ultimately just wants to test and document their code, and maybe isn’t even planning on publishing it except by making the git repository available to people who want it.
While the existing two options aren’t always ideal, any benefits dependency groups offer need to be weighed against the frustration users will feel from once again having too many choices to make when deciding how to lay out their projects.
[1] or maybe not so new - this is very close to my experience, and I’m anything but a new packaging user!
If we’re hearing that relative path support is an important feature for dev workflows in PDM, I don’t think we want to move past this point. Failing to support it could mean that PDM cannot migrate to this feature set, a net loss for the community.
I appreciate the warning that this changes what the spec proposes and “how to teach this”. That’s true, but I think that it needs to evolve.
Specifying this would seem to push support for relative paths onto every tool which wants to support this spec. I acknowledge that as a downside, but I don’t see a clean way to avoid that.
This line of thinking actually benefits from dependency groups being different from extras: relative paths can’t be part of your package metadata. So let’s embrace these not being extras and not being strictly PEP 508; the two go hand in hand.
Suppose we extend the Dependency Groups spec to define “Local Package Specifiers” (better names for the syntax welcome) in a dedicated section of the spec. A Local Package Specifier is one of:
- a PEP 508 string
- a relative POSIX path, which must begin with .
- the above, followed by extras in brackets
- $TBD, some syntax for referring to other dependency groups
Intentionally, this is trying to match a subset of syntax which can be used in requirements.txt.
(Perverse idea: we could make the syntax for including other groups -r other. I don’t even hate it.)
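A rough sketch of what a dependency group using Local Package Specifiers might look like (the table name, group names, and the -r inclusion syntax are all placeholders):

```toml
[dependency-groups]
docs = ["sphinx"]
test = [
    # ordinary PEP 508 string
    "pytest>=7",
    # relative POSIX path, beginning with "."
    "./libs/test-helpers",
    # relative path followed by extras in brackets
    "./libs/test-helpers[coverage]",
    # hypothetical inclusion of another group (syntax TBD)
    "-r docs",
]
```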
Is this taking the issue of local relative paths too seriously? I’m worried it could be a make-or-break issue.
I second this point; various terms in Python packaging have already become a mess: requirement, dependency, package, project, distribution, module…
We should try to use the same word to refer to the same concept. “Dependency” seems to be the term most commonly used for packages that need to exist in order for a particular package to work, so we should use it instead of “requirements”.
I think that argues for moving to [project.dependency-groups].
I have seen others express the idea that we should get more comfortable with expanding the use of the project table for non-packaging needs, so this is in line with past discussions.
Are there any major objections to that name? Is using project for this off limits?
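Concretely, the question is whether the table would live under project, i.e. something like this (purely illustrative):

```toml
[project.dependency-groups]
test = ["pytest>=7", "./libs/test-helpers"]
docs = ["sphinx"]
```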
Yes, let’s. I’ve been thinking about Ruby’s Gemfile capabilities for dependency groups, but I’ve never published a gem. I’ll need some time to do some reading.
I mean to look at

- Ruby
- JS/Node.js
- Rust

because that’s about where my experience lies for languages that might have useful mechanisms.