Support for build-and-run-time dependencies

I would consider that a bug in that package and not in pip. That’s a use case that install_requires used to support just fine.

For most build systems outside of Python, build-and-run-time dependencies are the most normal thing in the world, so it’s strange that Python does not support them.

CPython itself is an example of such a dependency: nobody expects that you can build a package with some CPython and then at runtime use a different unrelated CPython. Yet, by separating build-and-run-time dependencies into a build-time dependency and an unrelated run-time dependency, that’s exactly what you’re doing.

For the packages that I had in mind, circular dependencies are not an issue. The old install_requires algorithm for pip worked fine for me.

That’s not relevant to this discussion: I still want to support the build-system table from PEP 518. We’re not talking about build requirements here: we’re really talking about run-time requirements that happen to be needed already at build time. That’s the correct way of thinking about it.

I guess at the end of the day, all you really want is some shorthand way to give a single requirement and have your build system automatically use it both at build time and at install time. That’s not something I’ve ever personally found useful, but I can believe that for a certain niche of packages it might be a convenient shorthand. Fortunately, it’s totally doable today – build systems are allowed and encouraged to expose whatever developer experience they feel like. For example, they could define a build-and-install-requires key in their configuration, if they think that’s useful to their users. So I suggest you take this up with whichever build system you’re using, or look for a new system.

There’s really nothing the core packaging standards can do to support this beyond that, since you’re fundamentally asking for a shorthand to put the same piece of data in two places (sdists and wheels), and we just specify how those files are interpreted; we don’t control the processes that generate them.


It’s already reported as a pip bug in https://github.com/pypa/pip/issues/6193 (and also https://github.com/pypa/pip/issues/6406 but there it’s worded less clearly).

Yes, and those bugs have been rejected by the pip team, and I agree with them.

When I say “you should take this up with whatever build system you’re using”, I mean, like, setuptools or scikit-build or whatever. I’m not personally convinced that this is a good feature for setuptools or whoever to add, but if anyone’s going to do it then they’re the appropriate place.

Look, the reason for opening a discussion here is precisely because it’s not clear whose “fault” it is and what the right way is to support such dependencies. It’s a documented pip feature which no longer works with pyproject.toml, so it’s not crazy to call it a pip bug. But we’re not getting anywhere if you don’t even want to acknowledge the problem.

It seems to me that Nathaniel and Paul do acknowledge the problem that pip builds and installs two separate copies of the same library. The proposed solution, however, is to ensure that pip can reuse the same built copy both for the dependent package’s build and for its installation, so there wouldn’t be a potential ABI mismatch between build time and run time.

I feel there is a disconnect in this thread about what the actual “problem” is. You have been insisting that the problem is that pip does not support build-and-run-time dependencies, but others have pointed out that this is not in itself the problem, but one possible solution to a more fundamental problem. Efforts are being put into finding the best solution to that fundamental problem, and it’s not helpful to insist that yours is the only correct solution without explaining why the others are not.


If you replace “pip can” by “pip must”, then I agree 100%. If the reusing is optional, then there is no difference from the status quo.


Most importantly, there is the ABI versioning issue, which was already discussed in several posts above. That one can potentially break the build.

That could be solved by making sure that the version of the dependency which is used at build-time is exactly the same as the one which is used at run-time.

In the context of setuptools, can you not use setup_requires to specify build-time dependencies? In that case, it should be trivial to define a list of build-and-run-time dependencies which you add to both setup_requires and install_requires inside your setup script.

If that does work, I don’t understand what the problem is. If it doesn’t, it seems like a fairly straightforward issue with setuptools: it should have some way of expressing build dependencies.
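A minimal sketch of that suggestion, assuming a classic setup.py and using cysignals purely as a placeholder dependency:

from setuptools import setup

# One shared list, fed to both keyword arguments, so the same requirements
# are used at build time (setup_requires) and at run time (install_requires).
BUILD_AND_RUN_REQUIRES = ['cysignals']

setup(
    name='mypackage',  # placeholder
    setup_requires=BUILD_AND_RUN_REQUIRES,
    install_requires=BUILD_AND_RUN_REQUIRES,
)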

PEP 517 defines an interface for a build backend (such as setuptools) to tell a frontend (such as pip) about build dependencies. The metadata in the built wheel then specifies install dependencies. I don’t see any argument for combining this information at the level of the standardised interface, but build backends are free to provide a way to specify a build-and-install dependency and expose it in both places.

You already have a way to express this: put an == pin in your pyproject.toml and then the same pin in your install_requires. I’m still puzzled about why this is not enough. And what alternative solution are you referring to? I think other people in this thread uniformly agree that separating build and run-time dependencies is good. Granted, in some cases you want to keep them in sync, but then we’re talking about how to ensure that the version used at build time is also satisfied at run time.
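For illustration, the manual approach could look roughly like this, assuming a hypothetical package that pins cysignals (the version number is made up):

# pyproject.toml carries the same pin as install_requires, e.g.:
#   [build-system]
#   requires = ["setuptools", "wheel", "cysignals==1.10.2"]
from setuptools import setup

setup(
    name='mypackage',  # placeholder
    install_requires=['cysignals==1.10.2'],  # kept in sync with pyproject.toml by hand
)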

I think Nathaniel’s idea for this case was that the build process should create an install requirement on the version it was built with. I’m not aware of any build backend so far that tries to handle this, but since setuptools runs arbitrary code, it should be possible to do it in a setup.py script, something like this:

import sys
from setuptools import setup

setup_requires = ['numpy']
install_requires = []

if 'bdist_wheel' in sys.argv:
    # Require the same numpy version at run time as the one used when building
    import numpy
    install_requires = ['numpy=={}'.format(numpy.__version__)]

setup(setup_requires=setup_requires, install_requires=install_requires)

I’ve ignored all corner cases for the sake of illustration, but I think that basic skeleton is the right approach.


I thought that setup_requires was effectively deprecated with PEP 518.

But apart from that, your approach could indeed work.

I think setup_requires is there more as a backup for old backends. In practice you would specify the exact build dependency in pyproject.toml, and install_requires would be generated as @takluyver showed above. Build backends such as setuptools could have a way to auto-inject some dependencies, as pointed out above.

Such exact pins in install_requires may turn out to be troublesome. Imagine pandas hard-codes numpy 1.18 while scipy hard-codes 1.17… A pip install pandas scipy may then easily lead to a broken environment unless we have a real dependency resolver. At the moment pip effectively lets the first version it encounters win 🙂

Part of this problem could be mitigated if a package defines an ABI versioning scheme. For example, all 1.x versions could be ABI-compatible.
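A rough sketch of what a setup.py could emit under that assumption, pinning to the ABI-compatible range rather than an exact version (the “compatible within a major version” rule, and numpy as the example dependency, are assumptions here):

# Built against e.g. numpy 1.18.1 -> require '>=1.18.1,<2' at run time,
# assuming ABI compatibility is promised within a major version.
import numpy

built_with = numpy.__version__
next_major = int(built_with.split('.')[0]) + 1
install_requires = ['numpy>={},<{}'.format(built_with, next_major)]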

Based on this, I’d like to make a bold suggestion: have a way for a build dependency to add run-time dependencies. This would default to the empty set (no run-time dependencies). But packages could specify “if I’m used as part of a PEP 518 build-system, then XXX should be added to install_requires”.

I just tried it and the caching does indeed seem to work in my use case.

I tried this in combination with pyproject.toml (without setup_requires):

from pkg_resources import get_distribution
setup(...,
      install_requires=['cysignals=={}'.format(get_distribution("cysignals").version)],
      ...)

But I would prefer something like

import cysignals
setup(...,
      install_requires=cysignals.__requires__(),
      ...)

Even better would be if this __requires__() were added automatically by the build system.

That’s something for your build backend to do: for example, setuptools could have an install_requires_from_build=['cysignals'] option that appends whatever version is used for the build to install_requires.
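To be clear, install_requires_from_build is hypothetical; no such setuptools option exists today. A rough sketch of the effect it could have, written out by hand in a setup.py:

from pkg_resources import get_distribution
from setuptools import setup

def pin_to_build_versions(*names):
    # Hypothetical helper: pin each run-time requirement to whatever version
    # is installed in the build environment.
    return ['{}=={}'.format(name, get_distribution(name).version) for name in names]

setup(
    name='mypackage',  # placeholder
    install_requires=pin_to_build_versions('cysignals'),
)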

Build backends can already do this. The API points of PEP 517/518 do not need to be changed for this; it’s just a matter of a build backend providing an interface to do it easily. As such, what you’re asking for sounds to me like a setuptools/poetry/other build backend feature, rather than something that needs to be agreed on and standardised.