PEP 517: how to pick compatible build dependencies?

Say we have a package like h5py, which builds against numpy. It can support a range of numpy versions but, once built, it requires numpy at runtime to be the same version it was built against (more precisely, a binary-compatible version - I don’t know how broad that is). With PEP 517 & 518, the sdist would specify a build dependency like numpy>=1.12, and the build process would then output a runtime dependency like numpy==1.17.2, based on the numpy version in the build environment.
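Concretely, the sdist’s pyproject.toml would carry something like this (a sketch - h5py’s actual metadata may differ):

```toml
[build-system]
# Build-time requirement: any numpy from 1.12 onwards is acceptable.
requires = ["setuptools", "wheel", "numpy>=1.12"]
build-backend = "setuptools.build_meta"
```

The backend would then emit a runtime requirement pinned to whichever numpy version it actually built against.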

However, now we want to install h5py with an older version of numpy (e.g. to test compatibility). We create an environment with numpy 1.15.4 (for instance), and then try to install h5py from source. Without build isolation, h5py builds against the numpy version in the target environment. With build isolation, pip creates a build environment with the newest version of numpy compatible with the build requirement. The wheel depends on a newer numpy, and pip either has to report that it’s incompatible, or upgrade numpy in the target environment - which is not what we want.

Clearly we can work around this by avoiding build isolation for now. But build isolation is a good thing overall, and I expect it to gradually get harder to avoid.
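For reference, the workaround looks like this (the build requirements must already be installed in the target environment, since pip won’t create an isolated build environment):

```
pip install --no-build-isolation h5py
```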

One idea that springs to mind is for pip to create the build environment with the same numpy version as the target environment (assuming this meets the build dependencies), rather than the newest version. This would make this use case ‘just work’ again. But it feels like a bit of a hack - some of our environment state leaks into the build environment.
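The proposed selection logic could be sketched like this (hypothetical - this is not pip’s actual resolver API, just a toy model of the idea):

```python
# Prefer the version already installed in the target environment when it
# satisfies the build requirement; otherwise fall back to pip's current
# behaviour of taking the newest compatible version.

def parse(version):
    """Turn '1.15.4' into (1, 15, 4) for comparison (toy parser)."""
    return tuple(int(part) for part in version.split("."))

def choose_build_version(minimum, installed, available):
    """'minimum' models a 'numpy>=X' build requirement."""
    if installed is not None and parse(installed) >= parse(minimum):
        return installed  # reuse the target environment's version
    compatible = [v for v in available if parse(v) >= parse(minimum)]
    return max(compatible, key=parse)  # default: newest compatible

# Target env has numpy 1.15.4; build requirement is numpy>=1.12
print(choose_build_version("1.12", "1.15.4", ["1.12.0", "1.15.4", "1.17.2"]))
# -> 1.15.4 (instead of 1.17.2)
```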

If we had infinite time, we could build wheels with every possible combination of build dependencies, and then select the one that needed the fewest upgrades/downgrades of other packages in the target environment. Clearly we’re not going to do that, but maybe it helps to think about other solutions.

Looping back to the discussion in Support for build and run time dependencies, maybe build backends need some way to inspect the target environment to choose build dependencies. I don’t think PEP 517 backends currently have any way of knowing about the target environment.


Installing into an arbitrary environment is bound to cause issues, as you have to deal with existing state. Declaring your target environment is, in my opinion, the preferred way. So should we put more effort into mutating environments and, in this case, into providing ways to inspect the environment?

In the projects in the scipy/pydata stack that I am familiar with, numpy is pinned in the pyproject.toml file to numpy==1.12 instead of numpy>=1.12 to avoid what you describe. That way, the isolated build environment always uses the oldest supported numpy version. See eg
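i.e. the pyproject.toml pins the build requirement exactly (illustrative - check each project for its actual pins):

```toml
[build-system]
# Pin to the oldest supported numpy so wheels built in isolation
# are binary-compatible with every newer runtime numpy.
requires = ["setuptools", "wheel", "numpy==1.12"]
```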

Thanks @jorisvandenbossche - it’s good to know that building on an older version should work for numpy.

We maybe still need ways to handle this for dependencies that aren’t as careful with compatibility as numpy, but that may be less urgent.

The problem with pinning to the oldest version is when the oldest is not compatible with the version of Python you’re using (not a hypothetical - numpy was broken on 3.8 until it was fixed), which means you may have a patched or VCS version installed out of necessity - but then you somehow need the build environment to get it too, or the build fails.

The best suggestion that works now (other than disabling isolation) is to create your own wheel of the working version and make sure it’s the only available one when building the package you actually care about. I don’t think this is very discoverable, but I also don’t have a suggestion that preserves isolation.
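Sketched as commands (hypothetical paths; `./wheels` is just a local directory):

```
# build a wheel of the patched/working numpy
pip wheel ./numpy-checkout -w ./wheels

# fetch the package you care about, then install with PyPI disabled,
# so the local wheel is the only numpy pip can use when creating the
# isolated build environment
pip download --no-deps -d ./wheels h5py
pip install --no-index --find-links=./wheels h5py
```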

For that, scipy pins numpy to a different version for each Python version. But of course, that is not necessarily future-proof for upcoming Python releases, as in the case you mention (given what you’re saying, scipy should probably add a line for Python 3.8 requiring the latest numpy, if it has already been released).
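Something in this spirit, using environment markers (versions are illustrative, not scipy’s actual pins):

```toml
[build-system]
requires = [
    "numpy==1.13.3; python_version=='3.6'",
    "numpy==1.14.5; python_version=='3.7'",
    # a new entry is needed whenever a Python release breaks old numpy
]
```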

I think it would be helpful if pip gave users a way to manually override the build dependencies declared in the package. For example, if there’s a specific version of numpy you want, you could write

pip install h5py --force-build-require="numpy == 1.5.2"

I’ll leave it to others who need a feature like this to discuss its merits, but if we do want something like this, it should probably be an update to PEP 518 - “build frontends SHOULD provide a means to manually override the build dependencies specified in a package at runtime” - which pip could then follow.

Also, from a UI perspective, pip install h5py --force-build-require="numpy == 1.5.2" seems fine, but what if you have multiple targets in the one command?

pip install h5py scipy requests --force-build-require="numpy == 1.5.2"

Yes, “obviously” people shouldn’t do that, but the design still needs to define what happens if they do.

Overriding the path to the build environment (to one that’s already partly configured) might be better, assuming that sometimes you need to override the dependency to something that hasn’t been released.

PEP 517 does contain a Recommendations for build frontends (non-normative) section, which states:

… Therefore, build frontends SHOULD provide some mechanism for users to override the above defaults. For example, a build frontend could have a --build-with-system-site-packages option that causes the --system-site-packages option to be passed to virtualenv-or-equivalent when creating build environments, or a --build-requirements-override=my-requirements.txt option that overrides the project’s normal build-requirements.



We also have to implement config_settings for PEP 517.

And both of these need to be done in a way that lets folks use custom flags on a per-package basis.

Oh cool! Please put the keys to the time machine back now that you’ve finished with them :slightly_smiling_face:

Getting the UI for the functionality right is still going to be a bit of a challenge, but good to know there’s no need to have a PEP revision cycle as well.