Support for build-and-run-time dependencies

I have a wishlist item for the PyPA: first-class support for build-and-run-time dependencies. I mean dependencies that are needed both while building a package and also while running that package. A typical example is a package using the C API of another package (either through Cython or a manually exposed C API like numpy).
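As a sketch of the use case (hypothetical package name "mypkg"; the deferred imports are a common setup.py pattern, not anything this proposal mandates):

```python
# Sketch of a setup.py fragment for a package that uses numpy's C API
# (hypothetical package "mypkg"). numpy is needed here, at build time,
# for its headers -- and again at run time, so today it has to appear
# both in the build requirements and in install_requires.
def numpy_extension():
    from setuptools import Extension
    import numpy  # the very same package listed in install_requires
    return Extension(
        "mypkg._core",
        sources=["mypkg/_core.c"],
        include_dirs=[numpy.get_include()],  # path to numpy's C headers
    )
```

A setup.py would pass `ext_modules=[numpy_extension()]` to `setup()`; deferring the imports keeps the file loadable without numpy, but the build itself still fails unless numpy is already installed.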

There is an underlying assumption in pip and PEP 518 that every dependency is either a build tool or a run-time dependency. That may be true for the vast majority of Python projects, but it ignores the C API use case.

A solution would be to improve support for install_requires such that a package can use its install_requires dependencies at build time. That’s basically pip issue #6406.

The build-system table from PEP 518 could then truly refer to the build system; these build-and-run-time dependencies conceptually don’t belong there. They are typically only needed inside the setup() function, so the problem of importing them at the top of setup.py does not arise.

I’m not expecting an immediate fix here, but it would help to put this somehow on the PyPA roadmap.

See also pip issue #6411.

What’s stopping you from specifying such dependencies under both pyproject.toml requires and install_requires?
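Concretely, the suggested duplication would look something like this (a sketch; "mypkg" and the package list are made up):

```toml
# pyproject.toml -- numpy listed as a build dependency (PEP 518)
[build-system]
requires = ["setuptools", "wheel", "numpy"]
```

…and then numpy is listed a second time in install_requires (in setup.py or setup.cfg) so that it is also present at run time.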

There is no fundamental reason why not, but there are some issues:

  1. Those packages may be built twice, which is annoying if the build takes a while (likely, since we’re talking about C extensions here).

  2. There is no way to guarantee that the same version of the package is used at build-time and run-time. This may cause ABI problems.

  3. It goes against Don’t Repeat Yourself: keeping track of the same dependencies in two unrelated places is not clean.

  4. The “gut feeling” that these dependencies really shouldn’t be considered as part of the build system. The wording of PEP 518 is about build system/build tools but that doesn’t apply to these dependencies.

That’s a quality of implementation issue as I noted on the original ticket. It should be possible to fix this by better caching of wheels.

That’s a wheel tagging issue - if the build and the install are done on different machines, you’ll get a wheel for the runtime from your package index, and if the wheel tags say it’s valid, it should be. The implication of what you’re saying here is that your build system is creating incorrectly-tagged wheels.

Build time and runtime dependencies are fundamentally different things in the majority of cases, so while true, this is a relatively minor point IMO.

I don’t really know what you mean by this - it may be that there’s a certain class of tool (AIUI, this originated in relation to Cython) that doesn’t fit well in the build tools vs runtime libraries classification that PEP 518 is based on, but that needs to be made much more specific than just a “gut feeling”. But in that case, a new PEP clarifying what those tools are and how their needs should be satisfied would be the way forward, not just making some sort of combination of the existing 2 classifications.

Yeah, as long as the wheels are cached, no extra build time should occur.

I second this. I actually find it more descriptive to specify it twice.

The only way I can see you running into issues is if the wheel tag is bad (not specific enough to capture incompatibilities), but were that the case, you would run into the same issues with install_requires being pushed into build-requires.

Typically, packages that use numpy’s C API have different build-time and run-time dependencies. In particular, you can usually build against any version >= X, but then at install time you need any version >= [the version you built against].
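For example (a sketch with a made-up version number):

```toml
# pyproject.toml for a package built against numpy's C API:
# build against the oldest numpy the project supports...
[build-system]
requires = ["setuptools", "wheel", "numpy>=1.13"]
```

…while the run-time requirement depends on whichever numpy version was actually present at build time, so the two specifiers cannot simply be merged into one.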

Yep, but surely this is just the whole known issue where the existing wheel tags aren’t sufficient for stuff that’s built against numpy? So this isn’t a new problem, just the same one we’ve had for a while.

I wonder whether there’s any value in a wheel tag that essentially means “only valid on precisely this machine, right now” - effectively enabling wheel caching without needing to provide any compatibility guarantees?

I don’t think it has anything to do with wheel tags. It’s straightforward to express the relationships with existing metadata. It’s just that a package like scipy will typically build-require numpy >= X, and install-require numpy >= Y, where X and Y might be different.

If we combined these together into a single build-and-install-require, like this thread proposes, then you’d lose the ability to express this accurately.

Oh, I see. Yes, that wouldn’t work under this proposal.

I’m really talking about “runtime libraries” that happen to be needed also at build time. numpy is actually a good example of such a library: it has its own C API. If another package wants to use the numpy C API, then numpy needs to be installed when building that other package.

But I would never call numpy a “build tool” (of course, what’s in a name).

Hmm, OK. But why keep building from source? Just build a wheel once and you’re done, surely? (And yes, this is the quality of implementation point that pip should reuse the wheels, but you can of course build the wheel manually and make it available to pip before doing your project build…)

I’m still not sure I see why this is such a big issue.

I never claimed that it was a big issue (read my wording in earlier posts here), but it will become a bigger issue if pyproject.toml ever becomes mandatory. So it would be nice if the PyPA could at least somehow acknowledge this use case and improve support for it.

Indeed, this is also part of the problem. Wheel tags are good for the Python ABI but not for the ABI of packages.
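For reference, this is all a wheel filename records (a minimal parsing sketch, ignoring the optional build tag; "mypkg" is a made-up name):

```python
# A wheel filename encodes name-version-python_tag-abi_tag-platform_tag.
# Note that nothing in it records which numpy (or any other package)
# ABI the binary was compiled against -- only the Python/platform ABI.
def wheel_tags(filename):
    stem = filename[: -len(".whl")]
    name, version, py_tag, abi_tag, platform_tag = stem.split("-")
    return {"python": py_tag, "abi": abi_tag, "platform": platform_tag}

print(wheel_tags("mypkg-1.0-cp38-cp38-linux_x86_64.whl"))
# → {'python': 'cp38', 'abi': 'cp38', 'platform': 'linux_x86_64'}
```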

Pip already caches the wheel the first time and then reuses it, doesn’t it?

The essence of the original bug report seems to be that no, it doesn’t. I haven’t checked that myself but I’m happy to assume it’s true - in which case it’s “just” a bug that needs fixing. I still feel I’m missing something about why this needed a discussion here rather than simply someone to fix that bug…

Sorry, all I meant by “big issue” was “needs to be a discussion here rather than an acknowledgement on the pip tracker that this is a bug that should just be fixed” (which I confirmed here).

The way I see it, it’s not one specific bug: it’s a set of design decisions made with the new pyproject.toml-based installations that have made this use case harder and more prone to breakage than before.

I guess that such build-and-run-time dependencies were never officially supported (but they just happened to work), so it wasn’t considered when designing PEP 517/PEP 518.

I still think build and run-time dependencies should be kept separate. And I consider that a feature, rather than a bug.

There is also a connection with PEP 517 and projects that can’t be installed via wheels.

The thing breaking install_requires is actually the install-by-wheel feature of pip (not pyproject.toml, as I initially thought): older versions of pip used to install packages in dependency order, as specified by install_requires (this is actually documented).

Newer versions that install by wheels no longer do this: wheels are built in an arbitrary order and then installed. (This doesn’t break anything for the moment, because pip still falls back to the classic installation if building a wheel fails.)
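The older ordering behaviour described above amounts to a topological sort over install_requires; roughly (a sketch with made-up package names, ignoring the circular-dependency case):

```python
# Install each package only after its install_requires have been
# installed -- the dependency ordering described above.
def install_order(requires):
    order = []

    def visit(pkg):
        if pkg in order:
            return
        for dep in requires.get(pkg, ()):
            visit(dep)  # dependencies first
        order.append(pkg)

    for pkg in requires:
        visit(pkg)
    return order

print(install_order({"mypkg": ["numpy"], "numpy": []}))
# → ['numpy', 'mypkg'] -- numpy is importable while mypkg installs
```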

I would say it was design decisions taken to reduce the chance of it working by accident, in order to increase the chances of it working on purpose.

“You can’t install this package unless you already have numpy installed” was definitely a case that we had in mind and wanted to get rid of.

This isn’t really workable. It breaks circular dependencies, and packages that can coexist at runtime often have conflicting build-requirements.

I’m sorry if I’m missing the point of what you’re saying. But so far I’m still not seeing any design problems.
