Pinning build dependencies

I initially created a thread about this at How to pin build dependencies but I think this may be a more productive place for it.

I wrote more about my particular problem in that thread, but to try to summarize: I would like a way to track and pin the transitive build dependencies of my project. We pin all of our runtime dependencies, but there have been multiple times in the past few months when automated builds of my project failed because a build dependency somewhere put out a bad release. There doesn’t seem to be a good way to track and pin build dependencies with Python packaging tools, though.

I’m curious if others think this is a problem, can offer any advice, or if there’s any interest or progress in making pinning build dependencies possible with Python packaging tools.

Right now I think the best way to do it is to bypass build isolation and pre-install your build dependencies (which in your case is fine because it’s CI, but you could also manually set up a disposable build environment).
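A minimal sketch of that approach, assuming a hand-maintained `build-requirements.txt` that lists your pinned build dependencies (the filename is my own convention, not anything pip looks for):

```shell
# Pre-install pinned build dependencies into the current environment.
python -m pip install -r build-requirements.txt

# Install without build isolation, so pip builds against the
# pre-installed, pinned packages instead of creating a fresh
# ephemeral environment with unpinned versions.
python -m pip install --no-build-isolation .
```

With `--no-build-isolation` it becomes your responsibility to have every build dependency already installed, which is exactly what makes the pins stick.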

Ultimately, pip makes a best effort to do things automatically, but it’s still a tool that needs expert hands to use reliably. If it’s breaking, time for you to become the expert :slight_smile:

(BUT, I’m not a pip maintainer. So if the pip team wants to be able to handle this case - which I’d characterise as “consistent build environment for all packages being installed together” - I’m sure they’ll come up with a plan to support it.)

Hello there, I’m working on a downstream repository. Each repo release ensures a single version of each project per platform, and one can use it to replace PyPI for both build and runtime dependencies for reproducibility. It has barely any packages, though, and is really in an early stage of development, so this is an invitation for collaboration rather than a proper advertisement (-;


That would allow us to specify versions of our build dependencies, which pip doesn’t easily allow with build isolation since flags like --constraint are not respected. However, my project already has a workaround for that: the PIP_CONSTRAINT environment variable, which does affect isolated builds.
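For reference, a sketch of that workaround. The key detail is that the `--constraint` flag is not forwarded into the isolated build environments pip creates, but environment variables are inherited by them, so `PIP_CONSTRAINT` applies to build dependencies too:

```shell
# constraints.txt pins exact versions without adding requirements,
# e.g. lines like:
#   setuptools==65.5.0
#   wheel==0.38.4

# Passing the file via the environment variable (an absolute path
# is safest) makes the pins apply inside isolated builds as well.
PIP_CONSTRAINT="$PWD/constraints.txt" python -m pip install .
```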

It’d still be on us to manually track down all of our transitive build dependencies, though, right? I think that’s the piece I’d personally like tooling to help with, and I’ve never seen tooling that does it, which surprises me since I’d think others would benefit from the stability and consistency here.


Yes, though that’s just normal dependency resolution. So pip install will do it, or pip-tools can calculate it without actually running the install (though I’m not sure if it uses the same resolver that pip has now).

I’m not sure any of those tools help.

If I disable build isolation, pip install no longer installs build dependencies. If I leave build isolation enabled, I think build dependencies are installed in an ephemeral environment and discarded. None of pip-tools, poetry, or pipenv seem to include build dependencies in the generated requirements/lock file.

For setting up your build environment manually, you will need to install packages. You can use pip for that, provided none of them need to be built. If they do need to be built, you simply recurse and repeat (or let pip handle those - which should be comparatively simple builds - in isolated environments).

But once you know that some package requires, say, flit to build, then pip install flit followed by pip freeze (in a fresh environment) will show you exactly which dependencies flit pulls in. Then you can pin them in your build environment for when you do the real build.
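As a sketch, probing in a throwaway virtual environment keeps the freeze output limited to flit and its transitive dependencies (the paths and output filename here are illustrative):

```shell
# A throwaway environment ensures `pip freeze` lists only flit
# and what it pulled in, not unrelated packages.
python -m venv /tmp/probe-env
/tmp/probe-env/bin/python -m pip install flit

# Capture the exact installed versions as pins to feed into the
# real build environment (e.g. via a constraints file).
/tmp/probe-env/bin/python -m pip freeze > build-constraints.txt
```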

Right. Thanks for the detailed explanation of how to track down and install build dependencies manually, but it’s the need for all this manual effort to pin build dependencies that I’d ideally like to change. When all components and extras of my project are installed, we have 175 dependencies, and manually tracking down all of their build dependencies across all systems is a fair bit of work that I think tooling should ideally be able to help with.

For the Python packaging community here, is there any interest or progress in making it easier to pin build dependencies? If not, is the thinking that this isn’t really necessary and occasional failures at install time when wheels aren’t available are acceptable?

Yeah, that is a big ask.

Personally, I think being able to ensure a consistent build environment across all your packages is really important, and the only way to make sure they work together. But I don’t think any effort has been put towards that - most effort is looking at producing a single package’s wheels to put on PyPI. Private whole-environment rebuilds aren’t really covered right now.

Perhaps when the new build tool is closer to being ready and you don’t have to work around pip, it’ll be a good starting point for a new tool that can do consistent builds.


FWIW, pypa/build (a simple, correct PEP 517 package builder) is ready for use. It’s low-level plumbing though, so it’s quite likely that you’d have to build something wrapping it to get what you want here.
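For example, build isolates by default but can be told to use the current environment, which lets pre-installed, pinned build dependencies take effect; a sketch (assuming you have already installed and pinned the build dependencies in the active environment):

```shell
# Install the builder itself.
python -m pip install build

# --no-isolation builds against the current environment, so
# whatever versions you pre-installed there are what get used.
python -m build --no-isolation --wheel .
```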

For the Python packaging community here, is there any interest or progress in making it easier to pin build dependencies? If not, is the thinking that this isn’t really necessary and occasional failures at install time when wheels aren’t available are acceptable?

There is a desire to improve this, but there isn’t enough bandwidth among the volunteers who write/maintain the tooling to tackle this. As Steve noted, it’s a fairly big [t]ask, since there are a lot of nuances there, and it’s not clear what sorts of knobs are needed.


Thanks a lot for the info. I figured that’s where things were more or less at, but since I haven’t personally seen any significant discussion about this, I thought I might be missing something.

I understand it’s a hard problem. I don’t think I have the resources to be the primary driver on a solution here, but I’ll keep an eye out for any work on this and contribute to it if I can.