PEP 517 build isolation on closed networks

I work on closed networks that have no internet connection. Approved Python packages are typically given to me as sdists downloaded from PyPI. One such package is PEP 517 compliant. When I attempt to install it, pip tries to build the package in isolation by downloading the build requirements, which is not possible on a closed network, so pip fails. What’s surprising to me is that being unable to set up an isolated build environment is treated as a failure and the package is not installed.

The build requirements for this package are already satisfied in my environment, and I can currently circumvent the issue using --no-build-isolation or --no-use-pep517. Are these really the recommended solutions? Is there some argument against pip falling back to existing installations of the required packages when it fails to obtain them for an isolated build?
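For reference, the workarounds I’m using look like this (the sdist path here is just a placeholder):

    pip install --no-build-isolation path/to/package.tar.gz
    # or, to skip the PEP 517 build path entirely:
    pip install --no-use-pep517 path/to/package.tar.gz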

I’m proposing that pip continue to install PEP 517 compliant packages, even if build isolation is not possible.

  1. Build isolation is not required according to the language in the PEP.
  2. Obtaining packages for build isolation shouldn’t be an issue for 99.9% of people, so I don’t think a fallback violates the spirit of the PEP.

While I understand the usefulness of such a fallback, my feeling is that it should not be the default: there are simply too many ways to mess up a build environment if it isn’t entirely separated from the actual site-packages, and those problems are difficult to debug for less experienced users. OTOH, advanced users who can debug build environment corruption are also in a good position to figure out the subtle differences between the two environments. So I feel pip’s current behaviour is more reasonable for a general tool: keep inexperienced users safe by default, but provide a way to turn isolation off (--no-build-isolation) for advanced users who know what they’re doing.

Also note that there are proposals to add ways to more easily build a “wheelhouse” of build requirements for PEP 517 installations without an internet connection: you put the build dependency wheels in a directory and run pip install path/to/sdist --no-index --find-links=<directory> to perform PEP 517 installations without accessing the Internet. I searched but couldn’t find the relevant GitHub issue right now, unfortunately.
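As a rough sketch of what that workflow can already look like today (assuming the sdist declares setuptools and wheel as its build requirements; substitute whatever its pyproject.toml actually lists):

    # on a machine with internet access, collect the build dependencies
    pip download --dest ./wheelhouse setuptools wheel

    # on the closed network, install without touching the network
    pip install path/to/sdist --no-index --find-links=./wheelhouse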


That’s overly cautious. The cross section of people who

  • don’t have internet connections and can’t obtain packages
  • have screwed up environments that would actually cause problems
  • are inexperienced and apparently unable to read a big banner that says “unable to achieve build isolation, you may experience issues, please see the documentation”

is incredibly small.

On the other hand, your position implies that everyone who doesn’t have an internet connection and can’t obtain packages (not just the narrow cross section listed above) has to be an expert to resolve the issue. When pip fails to install requirements for build isolation, it prints some “failed to obtain” messages and quits. There is absolutely nothing to help the user understand why their package failed to install, what PEP 517 even is, or that --no-build-isolation might help in their situation. In fact, when I was first confronted with this, I had to do research to figure out what was causing the issue. I even asked several more knowledgeable people, who took some time to figure out what the issue was and how to work around it.

There is also the issue of automation tools like tox. For tox to work with PEP 517 packages in closed environments, it will now need special support for --no-build-isolation. With the suggested change, no special support is needed (well, perhaps tox would want to pipe through the “failed to obtain build isolation” message). And this goes for all automation tools; they would all have to do additional work to support this case, when it could otherwise be avoided.

(Perhaps tox is a bad example since it already tries to provide isolation, but you know what I mean.)

I’m fine with passing --no-build-isolation, but all my experience has told me it’s a bad idea. And the trade-off is a net negative for usability, IMO.

If you’re doing the work to manually figure out the set of build dependencies for each package and make sure that they’re already available in the environment before each install, would it be significantly harder to point pip at a directory with the wheels for all your build dependencies?

I know it would require a change in what you’re doing, and changing things is costly, but build isolation is kind of important for reliably building packages, and in some cases it’s actually impossible to build packages correctly without it (e.g. if you have two packages with conflicting build requirements).
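To make the conflicting-requirements case concrete, here’s a hypothetical pair of pyproject.toml files: no single shared environment can satisfy both, but isolated builds handle each one independently.

    # package A's pyproject.toml
    [build-system]
    requires = ["setuptools<60"]
    build-backend = "setuptools.build_meta"

    # package B's pyproject.toml
    [build-system]
    requires = ["setuptools>=65"]
    build-backend = "setuptools.build_meta"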

What you’re saying is that pip was designed around a central package store, and what I’m trying to do is not really idiomatic. So instead of installing dependencies piecemeal, I should create a package store within the closed network?

We have thought about doing such a thing for a while; it would be a lot better than what we have now, but we aren’t sure how to go about achieving it. Is there any documentation on the subject? Does pip even support that? We aren’t allowed to run servers, so it would have to be a directory structure on a network drive.

This is certainly a situation pip is intended to support, but the assumption is that if PyPI isn’t available, you have to tell pip where to look for packages instead. (The term pip uses is “index” rather than “package store”.)

There are two options, --extra-index-url and --index-url, which can be used to specify additional indexes or to replace PyPI as the default index. There is also a --find-links option that lets you specify a simple directory full of files, without needing to set up an actual index. These are all covered in the pip documentation (although if you’re not familiar with pip’s terminology, it may take some work to follow; feel free to ask if there are any parts of the documentation you have problems with).
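For example (the index URL and directory below are placeholders for whatever your network provides):

    # replace PyPI with an internal index
    pip install --index-url https://pkgs.internal.example/simple/ somepackage

    # or skip indexes entirely and resolve from a directory of files
    pip install --no-index --find-links=/mnt/share/wheelhouse somepackage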

If you want to set up a local index, there are projects such as devpi which do this for you - or you can implement your own relatively easily (the index protocol is designed to be little more than what a webserver serving a static directory will give by default). But a network-shared directory referenced via --find-links is a completely viable solution needing no extra software. (It doesn’t even need to be network-accessible if you’re only using pip on a single machine).
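To give a feel for the difference, here are two hypothetical layouts. A --find-links directory is just a flat pile of files, while the “simple” index protocol (PEP 503) expects one subdirectory per normalized project name:

    # --find-links: a plain directory of distribution files
    /mnt/share/wheelhouse/
        setuptools-65.5.0-py3-none-any.whl
        wheel-0.38.4-py3-none-any.whl

    # static index: one directory per project, served over HTTP
    /var/www/simple/
        setuptools/
            setuptools-65.5.0-py3-none-any.whl
        wheel/
            wheel-0.38.4-py3-none-any.whl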

Also, you only need to store the packages you actually use in the repository. As @njs pointed out, you already know which packages are needed to create your build environment, so you can just add those to your local index and pip should be happy with that.
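One way to populate it, assuming you track your build requirements in a file (the filename here is made up):

    # build-requirements.txt lists e.g. setuptools, wheel, cython
    pip download --dest /mnt/share/wheelhouse -r build-requirements.txt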
