Relaxing (or clarifying?) PEP 621 requirements regarding dynamic dependencies

Okay, fair enough. That’s not what I was getting at though, so let me rephrase: at any point in time, the current state of the main branch of a project has dependencies (as does every other branch). The existence of dependencies is not limited to release artifacts. For your example, Django, it looks like they’re in its setup.cfg and its pyproject.toml.

So to rephrase my two points slightly, we have:

  1. Dependencies of the current state of the source code of a project,
  2. Dependencies specifically for building wheels for redistribution on PyPI, i.e. the dependencies used during an isolated build.

pyproject.toml lives in VCS. It’s one file, and it must express dependencies for two different things. To stay with the numpy example from @FFY00’s first post in this thread: for (1) it’d be perfectly valid for a user to build scipy against any supported numpy version, such as 1.24.1. The pyproject.toml content seems to say otherwise though: it contains ==1.21.6 pins, because the project chose to express the dependencies for (2), i.e. they’re set to the values needed for a wheel build for distribution on PyPI.
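To make the tension concrete, here is a rough sketch of the two kinds of build requirements a single pyproject.toml currently has to conflate. The exact version numbers are illustrative only, loosely modeled on the scipy/numpy case above; the real files differ:

```toml
# Meaning (2): what projects write today so that PyPI wheels are
# built against the oldest supported numpy ABI.
[build-system]
requires = [
    "meson-python",
    "numpy==1.21.6",  # exact pin, only needed for redistributable wheels
]

# Meaning (1): what the source tree actually supports, which is what
# a user building from a VCS checkout would want, e.g.:
#
#     "numpy>=1.19",   # any supported numpy, including e.g. 1.24.1
#
# There is currently no standard place to express both.
```

The file can only hold one of the two, so whichever meaning the project picks, the other audience gets misleading metadata.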

I’m not sure I agree, or that this was a conscious decision when pyproject.toml was introduced. The language in PEP 518 that would support your point of view here seems to be missing. E.g., it starts with “This PEP specifies how Python software packages should specify what build dependencies they have”. It does not say “build dependencies to create wheels from an sdist”. So I suspect the question was left open, and we have different interpretations.

Either way, we’re in a pretty unhealthy state. It’d be much better if pyproject.toml captured the actual dependencies of the code base it’s included in, separately from wheel-specific constraints. I want to be able to express “foo depends on bar>=1.2.3” (independent of how bar was installed). That’s the more interesting info for a wider audience imho: it determines which features from bar contributors are able to use in the code base, and what the metadata should be for any binary artifact of foo that a packaging system derives from a VCS tag or sdist. Those seem like things worth capturing in metadata.
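As a sketch of the kind of declaration I mean (foo, bar and the versions are placeholders, and the comment describes a hoped-for outcome of a future PEP, not anything standardized today):

```toml
[project]
name = "foo"
dependencies = [
    "bar>=1.2.3",  # the real requirement of the code base, valid for
                   # any build from a VCS checkout, tag, or sdist
]

# Wheel-specific narrowing (e.g. an ABI pin for PyPI wheels) would be
# layered on top by whatever mechanism a future PEP defines, instead
# of overwriting the range above.
```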

Agreed. We’re going to do that now in meson-python.

I don’t think it blurs the lines too much, and it is important. I will note that:

  • There are many, many packages that technically cannot express metadata as static at all right now. This includes many of the most popular packages on PyPI for the PyData stack: SciPy, scikit-learn, scikit-image, Matplotlib, statsmodels, most users of Cython, etc.
  • The flexibility needed here is limited; it’s not like all these projects have fully dynamic metadata. They only add one or more constraints, so the versions satisfying the wheel dependencies are a strict subset of those satisfying the sdist ones.
  • This is not a niche thing; it applies any time one uses a C/C++ API. In fact, this “dependency narrowing” is so important that for CPython it has been encoded in wheel filename metadata. In an sdist we have (a) pyproject.toml, with requires-python = ">=3.8", and (b) PKG-INFO, with Requires-Python: >=3.8. A corresponding wheel can have different metadata: as soon as you use the CPython C API, the >=3.8 narrows to a specific minor version, expressed in the wheel tag as e.g. -cp310.
  • If we leave it to tool-specific settings, any other consumers of pyproject.toml will be unable to support this. E.g., goodbye to the GitHub Dependency graph reporting those dependencies.
  • This post by @steve.dower identified “allow (encourage) wheels with binaries to have tighter dependencies than their sdists” as one of four key things to do to improve interoperability with Conda.
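To spell out the requires-python example from the list above (file contents abbreviated; the wheel filename is illustrative, but the cp310 tag form is the standard CPython ABI tag):

```toml
# In the sdist: pyproject.toml (PKG-INFO carries the equivalent
# Requires-Python: >=3.8 field)
[project]
requires-python = ">=3.8"

# A wheel built from this sdist against the CPython C API only works
# on one minor version; that narrowing is encoded in the filename tag:
#
#     foo-1.0-cp310-cp310-manylinux_2_17_x86_64.whl
#
# i.e. ">=3.8" in the sdist becomes "exactly 3.10" for this wheel.
```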

I hope the above is enough to convince you that the current state is a problem, and important to address. The work to fix it needs to be done through a PEP, but I hope we can agree that it’d be beneficial to put effort into that.
