On behalf of my co-author @Lecris and sponsor @FFY00, I’d like to announce PEP 808: Partially dynamic project metadata. You can see a rendered version here.
This PEP proposes allowing list- and table-valued fields (such as dependencies) to be listed statically and also extended by the build backend if they are present in the dynamic list. So you can now do this:
[project]
name = "partially-dynamic"
version = "0.1.2"
dependencies = ["numpy"]
dynamic = ["dependencies"]
Your build backend would then be allowed to add dependencies for you. Removing or changing an existing static entry is not allowed. You can see a lot of possible uses, and interactions with some other PEPs, in the PEP. There will also be a dynamic-metadata package that build backends can use to provide dynamic metadata (including partially dynamic metadata) that is easy for users to configure, based on the mechanism inside scikit-build-core.
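As a rough sketch of the allowed behaviour (the added package here is purely illustrative):

static = ["numpy"]                  # from [project.dependencies]
added_by_backend = ["scipy>=1.10"]  # computed by the backend at build time

# Allowed: the final metadata is the static entries plus the backend's additions.
final = [*static, *added_by_backend]  # -> ["numpy", "scipy>=1.10"]

# Not allowed: dropping or rewriting a static entry,
# e.g. turning "numpy" into "numpy>=2" or removing it entirely.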
Naive question from someone not as well-versed in all things packaging: does dynamic project metadata not then also imply that you’d have to invoke the build backend during the locking phase, which would then make a pylock.toml environment specific? This seems at odds with the goal of having a lock file?
This doesn’t change dynamic pyproject metadata; it just allows you to put the static part of your dynamic metadata in the normal spot instead of forcing all of it to come from the build backend. So it’s making more metadata static than before.
For your question, it depends on whether the SDist METADATA is dynamic (which is a different concept, though you can’t have dynamic METADATA without it also being dynamic in pyproject.toml). If the SDist metadata is not dynamic, then any possible wheel will have the same metadata. Version cannot be dynamic in METADATA (even though it is allowed as dynamic metadata in pyproject); the others are choices made by the backend supplying the metadata.
If it’s listed in METADATA as dynamic, then yes, every wheel could have different metadata. That’s true today. If not, then you only need to trigger it once, and then you can use that across all environments.
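For reference, the sdist case being described corresponds roughly to a PKG-INFO/METADATA like this (per PEP 643; values are illustrative), where any wheel built from the sdist may then differ in Requires-Dist:

Metadata-Version: 2.2
Name: partially-dynamic
Version: 0.1.2
Dynamic: Requires-Dist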
I actually find that it’s easier to think of this the other way round. Rather than it being “partially dynamic metadata”, it’s just ordinary “dynamic metadata” with some of the content being declared statically. The important point is that this proposal doesn’t change anything for static metadata. Rather, it allows some dynamic metadata to be slightly less dynamic.
I would actually like to see this reflected in the “How to teach this” section of the PEP. The way (IMO) this feature should be taught is to say that if you currently have dynamic metadata, but some of the values are the same for all builds, you are now allowed to make that explicit by putting the static part of the metadata into [project], while still marking the metadata item as dynamic.
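For example (the package names here are illustrative), a project that previously declared its dependencies as fully dynamic:

[project]
name = "example"
version = "1.0"
dynamic = ["dependencies"]

could move the part that is the same for every build into [project], while keeping the item dynamic:

[project]
name = "example"
version = "1.0"
dependencies = ["requests"]   # the same for every build
dynamic = ["dependencies"]    # the backend may still add build-specific entries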
As the PEP stands, I think it suggests that you might want to change your existing static metadata items to make them extensible. And that in turn leads to the footgun of people making metadata items dynamic “just in case they need extending in the future”.
I think there are backwards-compatibility implications for dependency analyzers and similar tools which bear stating explicitly. Currently, a tool can look at project.dependencies and, if it is populated at all, it is guaranteed to be complete. So we may have code which looks like this:
if pyproject_data["project"].get("dependencies") is not None:
    static_metadata_path()
elif "dependencies" in pyproject_data["project"].get("dynamic", []):
    dynamic_metadata_path()
else:
    no_dependencies_path()
Now code should be updated to check project.dynamic regardless of whether or not static values are available:
if "dependencies" in pyproject_data["project"].get("dynamic", []):
dynamic_metadata_path()
elif pyproject_data["project"].get("dependencies") is not None:
static_metadata_path()
else:
no_dependencies_path()
Note how the order needs to change for correctness. This PEP invalidates a previously safe assumption code could make.
All that said, I have no problem with it! I think it’s a tradeoff worth making. Only a small class of tools are affected.
Static tooling can now detect the static dependencies.
I’m unsure how useful this will be in practice. In uv, for example, we would need to treat this as equivalent to dependencies being “fully dynamic”, since we can’t do anything useful with a partial list of dependencies. Do you have an example of a tool or use-case that would be able to make use of this metadata, i.e., that no longer needs to query for the dynamic metadata when it would’ve had to do so before?
For example, let’s say you want to allow an imaginary build backend (my-build-backend) to pin to the supported build of PyTorch.
The PyTorch pinning problem is real, though I think this proposal (as a solution) may have some weaknesses. For example, say you have torch==2.7.* in your project dependencies, then build a wheel that has torch as a build requirement (this is common) and, after the build process, emits torch==2.9.0 in its augmented dependencies list, since the build frontend resolved torch to its latest version then built the wheel against that torch. In that case, I think resolution would just fail? So you need some other mechanism to indicate that you want the torch version used at build time to match the torch version you’re planning to use at runtime. (We implement this today in uv with match-runtime = true. I’m not suggesting that’s a perfect solution, only that something like that would still be required to achieve the desired user experience, AFAICT.) It’s certainly an improvement that the built wheel can encode the required torch version though, and what I’m describing is more about the broader resolution problem than single-library wheel-building.
(The other thing to note about torch is that I believe their plan to solve that problem is via the introduction of the stable ABI, which (IIUC) should decouple the build-time and runtime torch versions, so it might be helpful to have another example of dependency pinning (or, relatedly, an example of why you might use dependency augmentation that isn’t narrowing the version of a statically-declared build dependency).)
(My guess is that you’re right and some tools make this assumption. For posterity, I did check and I believe we handle this correctly in uv.)
There’s nothing incorrect about the behaviour @sirosen described. The current standards are explicit that if a field is given a value in pyproject.toml, that field cannot be in dynamic:
Build back-ends MUST raise an error if the metadata specifies a key statically as well as being listed in dynamic.
PEP 808 is a backward incompatible change that alters that rule - the “Backward Compatibility” section notes that users must ensure that a PEP 808 compatible pyproject.toml uses a build backend that supports PEP 808. But it doesn’t discuss the impact on tools that are not build backends, but which are relying on the assurances currently given in the spec.
I agree with @sirosen that this isn’t an unreasonable change, but it is a change, and that should be acknowledged. It shouldn’t be hard to add something to the backward compatibility section that covers this - it’s possible to write code that works under both the current rules and the PEP 808 rules, so I don’t think we need to do anything more than explain how to do that. Tools that don’t support PEP 808 will work incorrectly if handed a project that uses PEP 808, but that’s what “doesn’t support PEP 808” means, so it’s hardly a disaster.
I think it will be useful, just not so much for installers. Tools that cannot deal with dynamic dependencies can be made more accurate, e.g. dependency analysis tools like https://deps.dev/ and https://libraries.io.
It will be useful for build backends too; @henryiii already mentioned scikit-build-core, so let me add that meson-python has such needs as well, e.g. for dependencies (issue, draft PR), SBOMs (issue), and license files (no open issue, but this will allow adding a feature to avoid having to do things like this (NumPy’s ad-hoc license concatenation)).
That metadata is clearly incorrect, so resolution should fail.
The PyTorch stable ABI will take a couple of years to propagate to all its dependent packages, I expect, and some of those may never be able to use it (for the same reasons that not all Python packages with extension modules use the CPython stable ABI), so that example is still relevant. There are other examples too, like the PyArrow team wanting to split up their build into a couple of wheels, or previously NumPy’s C API usage.
Agreed with the value of adding an example that isn’t narrowing - that will be more broadly applicable than the C/C++ ABI use case.
Hmm, I don’t think the metadata is incorrect? The case I’m describing is: you depend on torch==2.7.* and flash-attn, and flash-attn has a build dependency on torch (unpinned). When you install your project, it has to go build flash-attn from source, and the build frontend resolves to PyTorch 2.9.1, augmenting the metadata with torch==2.9.1. So the resolver sees that you now depend on torch==2.7.* and torch==2.9.1. But if the build frontend had “known” about your runtime dependency on torch==2.7.*, it could’ve built a wheel against PyTorch 2.7.1 and the resolution would’ve succeeded.
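A sketch of that situation (the project name and exact pins are illustrative):

# Your project's runtime requirements:
[project]
name = "my-app"
version = "0.1.0"
dependencies = [
    "torch==2.7.*",  # runtime pin
    "flash-attn",    # no compatible wheel, so it gets built from source
]

# flash-attn's metadata after the backend augments it with the torch it built against:
#   Requires-Dist: torch==2.9.1   -> unsatisfiable alongside torch==2.7.* above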
Ah, you were talking about the build and runtime dependencies on torch being in two separate packages; that wasn’t clear to me from your initial description. In that case yes, 100% agreed that’s not incorrect and is a real-world pain point today.
Does anything prevent build frontends from passing the original installation requirements through to the backend in such cases? e.g., PIP_CONSTRAINT
I understand that there’s no provision for this in the PEP 517 hooks. I think a new optional hook would be needed to standardize such behavior, but I’m wondering if it’s even possible to do it with some hackery today.
EDIT: sorry! Immediately after posting I realized that this doesn’t work because there’s no guarantee that the install requirements are even compatible with the build environment. And for most cases they aren’t relevant.
I’ll put those in draft until the person who suggested the change approves it / comments on it.
Yes, installers generally might need to run the hook, but things like GitHub’s dependency graph would find it much more useful to know some of the dependencies. You could even help them by listing a package unpinned here, even if a pinned version is injected later. Those sorts of applications can’t run arbitrary Python hooks; they are not written in Python and are not Python packaging tools.
The build vs. install pinning issue isn’t solved by this PEP.
This is one of the PRs. I should also note that the failure mode here isn’t always bad: using the wrong order means you think you have all the static metadata while you are actually missing some. If you don’t require the complete metadata, though, that might be roughly what you’d do after updating to support PEP 808 anyway (such as if you were building a dependency graph for GitHub’s display). It’s only a problem if you do need the complete metadata (for example, if you were validating that the dependency graph doesn’t include certain packages). But yes, it’s a change, and it’s now highlighted with a code snippet showing how to handle this. Also, PEP 621 did mention this could be relaxed in the future, so technically there was a slight indication that one order was better than the other.
We didn’t come up with a better name. I do want “dynamic” in the title, since it affects project.dynamic, and every variation we tried seemed a bit convoluted with both “static” and “dynamic” in it. Open to suggestions if anyone doesn’t like the current name. Hopefully the second PR above helps emphasize that this is helping make dynamic metadata more static.
Or “Including/Adding static fields to dynamic project metadata”?
Could you please clarify how tools should handle adding pre-release version builds? For example, if the static requirement is numpy and the build hook changes that to numpy >= 2.0rc1, this both restricts versions to NumPy >= 2.0rc1 and also allows pip to select pre-release versions. Should this simply be not allowed?
I think there needs to be a comment in the PEP explicitly stating that adding environment markers to requirements is not allowed, because it doesn’t restrict the requirement, unless I’ve misunderstood “restrict”.
I’m just here to provide a meaningful use case. I haven’t discussed this with @henryiii, but I think this PEP perfectly fits the usage scenario of the pdm-build-locked backend plugin.
Specifically, with this PEP we can have the following metadata
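Something roughly like this (the group name and concrete packages are illustrative):

[project]
name = "example"
version = "1.0"
dependencies = ["requests"]
dynamic = ["optional-dependencies"]

[project.optional-dependencies]
socks = ["requests[socks]"]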
After the build, it adds two additional optional groups, locked and socks-locked, containing pinned versions of all transitive dependencies. Without this PEP, we have to move the whole project.optional-dependencies under the [tool] table due to the constraints.
The PEP does not mention “restrict” (I just grepped it), and it intentionally does not say anything about what happens when values are added to dependencies (or any other field). Originally, I wanted to allow the backend to do some simple combining on dependency fields, but that has been removed, and I had a custom behavior for license (since it is a mini-language that supports AND), but it was an explicit requirement that the PEP not special-case any field at all and only be based on data types. You can add items to lists or tables or tables of lists; that’s all the PEP allows.
Adding numpy >= 2.0rc1 does allow prereleases to match. The presence of numpy without a prerelease specifier in the static portion does not mean you can assume that prereleases won’t be allowed, since more entries can be added and allowing prereleases is something an added entry can do. (That’s also true today if you pass a flag to pip.)
Adding a new dependency with an environment marker is fine in some combinations; ['numpy', 'numpy<2; python_version<"3.10"'] is valid, for example. ['numpy', 'numpy; python_version<"3.10"'] is not helpful, but that’s true today if you put them both in static or both in dynamic. I don’t think build backends do any validation to check for unhelpful combinations today.