PEP 517 and projects that can't install via wheels

Or by supplying their own scripts entry? But either way, I don’t think this qualifies as a reason to avoid wheel builds.

It’s certainly not a reason to avoid wheel builds. I mention it just so we’re aware that the change will probably break someone’s workflow. :wink:

I probably have a workflow like that: vext is a package that installs an import hook that works as a gatekeeper, allowing certain libraries to break out of the virtualenv.

Usually I configure vext to use libraries that were created before virtualenv and don’t play well with it.
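As a rough sketch of the general idea (this is not vext's actual implementation; the allow-list and system path below are made up for illustration), an import-hook gatekeeper of this kind boils down to a sys.meta_path finder that lets a small set of packages resolve against the system site-packages:

```python
# Minimal sketch of an import-hook "gatekeeper" in the spirit of vext
# (NOT vext's real implementation): a sys.meta_path finder that lets an
# allow-listed set of top-level packages be imported from the system
# site-packages even though the interpreter runs inside a virtualenv.
import importlib.machinery
import sys

ALLOWED = {"gi", "cairo"}                        # hypothetical allow-list
SYSTEM_SITE = "/usr/lib/python3/dist-packages"   # hypothetical system path


class SystemSiteFinder(importlib.machinery.PathFinder):
    @classmethod
    def find_spec(cls, fullname, path=None, target=None):
        # Only act on allow-listed top-level packages; everything else
        # falls through to the normal (virtualenv-restricted) finders.
        if "." in fullname or fullname not in ALLOWED:
            return None
        return super().find_spec(fullname, [SYSTEM_SITE], target)


sys.meta_path.append(SystemSiteFinder)
```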

Hi Stuart!

It’s not immediately obvious to me why that wouldn’t work when installed via building a wheel, but I haven’t thought much about how it would work. Can you expand a bit on why building wheels as an intermediate might be a problem?

I learned of some behavior related to this today.

Currently, if someone uses pip with either --global-option or --install-option, then pip will disable using wheels. This is described in pip’s documentation (the page linked in the original post has since moved).

Here are three pip issues about a bug in this feature (I diagnosed the bug today in the first of these issues):

To be clear, though, that’s an issue with pip’s UI, not a fundamental reason why the relevant projects can’t use wheels.

In the case of pip options like --global-option, --install-option, and --build-option, what changes would be needed to support using wheels in those cases?

Is this the same as the issue of passing the generic config_settings dictionary to the build backend that was also discussed in pip issue #6304 ("pip not naming abi3 wheels correctly") by e.g. @njs and you in this comment and this comment, respectively?

In the latter comment, it was suggested it might require tweaks to PEP 517 (in addition to getting agreement from setuptools on the format and meaning of the dict).

Basically, there’s no defined pip interface to PEP 517’s config_settings. We’d need to define one. It’s quite possible that would be as simple as saying that --global-option etc. pass their values straight through, but I don’t know how setuptools treats config_settings, so that might be a backward compatibility problem.

PEP 517 changes would only be needed if we wanted to somehow define standard config settings that all tools would agree on. But I don’t honestly know what sort of things people use --global-option for, so I can’t really judge that.
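For reference, PEP 517 already defines where such settings would arrive on the backend side: every build hook takes a config_settings dict whose keys and meanings are left entirely to the backend. A minimal sketch of that interface (the hook signatures are PEP 517’s; the handling of an "fcompiler" key is purely hypothetical):

```python
# Sketch of the backend side of PEP 517's config_settings. The hook
# signatures below are the ones defined by PEP 517, but the handling of
# an "fcompiler"-style option is purely hypothetical - PEP 517 says
# nothing about what the keys mean, which is exactly the gap discussed.
def build_wheel(wheel_directory, config_settings=None, metadata_directory=None):
    config_settings = config_settings or {}
    fcompiler = config_settings.get("fcompiler")   # hypothetical key
    # ... actually run the build here, honouring fcompiler if given ...
    return "example_pkg-1.0-py3-none-any.whl"      # basename of the built wheel


def build_sdist(sdist_directory, config_settings=None):
    # ... actually build the sdist here ...
    return "example_pkg-1.0.tar.gz"
```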


Of the 3 issues @cjerdonek linked to, it looks like one is using these flags to pass build options, which would have to be passed through the PEP 517 interface:

  • --global-option="build" --global-option="--fcompiler=nag"

and the others are using it to pass install destination options:

  • --install-option="--prefix=/tmp/test-install"
  • --install-option="--install-scripts=<directory>"

Pip already has a --prefix option, so it shouldn’t be necessary to pass a prefix through --install-option. To resolve the second one, I think pip would need a set of new options to override the install location of each component (scripts, purelib, platlib, etc.).
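As a point of reference, the install-scheme components that such per-component options would need to cover can be listed from Python itself; this is just an illustration using the standard library, not a pip feature:

```python
# List the install-scheme components (purelib, platlib, scripts, ...)
# that per-component pip override options would have to address.
import sysconfig

paths = sysconfig.get_paths()
for key in ("purelib", "platlib", "scripts", "include", "data"):
    print(f"{key:8} -> {paths[key]}")
```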


And in a PEP 517 world where we build via wheels, that one wouldn’t need or want to be passed to the build backend at all.

pip install and setup.py bdist_wheel treat install_requires differently. See the discussion at Install install_requires before running python setup.py bdist_wheel · Issue #6193 · pypa/pip · GitHub and at Support for build-and-run-time dependencies.

The referenced discussions involve pip, which adds complexity to the description that it would be helpful to avoid here. Can you provide a demonstration of the problem using just setup.py install and setup.py bdist_wheel, as requested - ideally with a specific published package rather than an artificial example (although a simplified example demonstrating the specifics of the issue would also be helpful)?

I believe the difference @jdemeyer is referring to is: if you run setup.py install, then setuptools will check if all the install_requires are already installed, and if not it will easy_install them. When pip runs setup.py install, it wants to avoid this, so instead it first runs setup.py egg_info, collects the list of install_requires, installs them into the target environment, and then runs setup.py install in the target environment.

Either way, setup.py install can be confident that it’s running in an environment where all the install_requires are already installed. setup.py bdist_wheel OTOH does not do any of this.

So if you have a package that (1) lists a dependency in install_requires, (2) doesn’t list it in setup_requires or pyproject.toml, (3) tries to import that dependency from inside a custom setup.py build step, then that will accidentally (!) work with setup.py install, but not with setup.py bdist_wheel.

I think in this case, using bdist_wheel is just revealing a latent bug in the package, and we already have a solution: we tell people to declare their setup requirements properly. But it is a case where switching from setup.py install to bdist_wheel could make something stop working.

(Note: I may have gotten some of the details of what setuptools does wrong. For example, I’m not sure whether setup.py install installs any missing install_requires before or after it runs the build step. If it’s after the build step, then the hypothetical package I described above will work when pip does setup.py install, but not necessarily when anyone else does setup.py install.)
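To make the failure mode concrete, here is a made-up setup.py following the pattern described above (the package and dependency names are only examples):

```python
# Hypothetical setup.py demonstrating the latent bug described above:
# a custom build step imports one of the install_requires without it
# being declared as a build dependency.
from setuptools import setup
from setuptools.command.build_py import build_py


class build_py_with_codegen(build_py):
    def run(self):
        # Relies on pip having installed install_requires first; fails
        # with bdist_wheel in a clean environment where cffi is absent.
        import cffi  # noqa: F401  (example dependency)
        # ... generate code with cffi here ...
        super().run()


setup(
    name="example-pkg",
    version="1.0",
    cmdclass={"build_py": build_py_with_codegen},
    install_requires=["cffi"],   # declared only as a runtime dependency
    # missing: setup_requires=["cffi"] or a pyproject.toml build requirement
)
```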


Thanks - I see. But I guess we’re just going round in circles in that case, as we’ve already discussed this “issue” multiple times, and the solution is still for packages to declare their requirements correctly.

No, because these packages rely on a specific pip feature described in pip’s documentation (the linked page has since moved).

Why do you keep saying that? The pip feature that these packages rely on was intentionally added and documented (Issue #2478: Topological installation order by rbtcollins · Pull Request #2616 · pypa/pip · GitHub). Maybe that was a bad idea in retrospect and the feature should be deprecated, but you cannot shift all the blame onto the packages here.

I don’t think that link is strong support for your case, because it says this:

Although the new install order is not intended to replace (and does not replace) the use of setup_requires to declare build dependencies, it may help certain projects install from sdist (that might previously fail) that fit the following profile:

This is the bit you’re depending on, and it doesn’t read like a guarantee to me.

There’s a larger shift which has been underway for the last few years to disentangle build and installation steps. You seem to be relying on the two being tightly connected. So at some point, either the entire ecosystem changes direction, or your use case will break. I think people are willing to help figure out what can be done instead, but insisting that install_requires must be considered build dependencies doesn’t seem useful.

If you read the PR you linked to, the change was almost rejected on the grounds that it could encourage people to depend on install_requires at build time, instead of using setup_requires. The only reason the PR was merged was because it had other unrelated benefits that were considered valuable enough to override the costs of mixing up setup_requires and install_requires.

We could argue about whether we call this a “bug” in the packages, or “an unfortunate choice the package maintainers made that has become incompatible with the larger ecosystem”. But it doesn’t really matter – either way the packages have to change. You’re getting zero traction on convincing people to treat install_requires as setup_requires, and you’ve been persistently ignoring all attempts to explain why that is, refusing to consider alternative approaches, and now you’re trying to drag other threads into debating your pet topic.

This is not a productive way to engage. Please find another way.

It seems to be hard to keep this from being confrontational, but how about we take this view. Whether that section of the documentation was a hard guarantee or not, isn’t that important now (and I don’t want to start assigning blame on that score). But yes, it is not going to be supported going forward, and PEP 517 in particular does not support that usage. Until PEP 517 becomes mandatory, the workaround is to use --no-use-pep517, but projects should be looking to modify their setup.py to correctly declare their dependencies as soon as possible.

A documentation PR explaining that would be welcome, but given all of the strong opinions being expressed in this discussion, I fear that it will be hard to find anyone who can impartially state the position in a way that will satisfy all parties :frowning:

My take on suitable documentation wording would be:

Note: Projects relying on the install order to avoid declaring dependencies which are required at both build and runtime in two places will no longer be able to do so under PEP 517. They should modify their setup.py to declare the build and runtime dependencies fully and independently. As a short term workaround, they can use the --no-use-pep517 flag to force the previous behaviour, but that flag will not be available indefinitely.

Assuming there’s no objection to that wording, I’m willing to make that change, if that will finally put this extended debate to rest.
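For concreteness, the kind of change that wording asks for would look roughly like this in a project’s setup.py (the package and dependency names are hypothetical):

```python
# Hypothetical setup.py declaring a build-and-run-time dependency in
# both places, instead of relying on pip's install order.
from setuptools import setup

setup(
    name="example-pkg",
    version="1.0",
    setup_requires=["cffi"],     # needed while setup.py itself runs (build time)
    install_requires=["cffi"],   # needed by the installed package (run time)
)
```

Under PEP 517, the build-time half can equivalently be listed in pyproject.toml’s [build-system] requires.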


But that’s the problem: this isn’t quite possible today. And that’s the whole point of the topic Support for build-and-run-time dependencies

I never said that packages should not be changed. I also never said that the behaviour of install_requires should be kept. I never refused to consider alternative approaches. Please don’t put words in my mouth that I didn’t say.