Moving python-build to PyPA

To be 100% clear — I agree that this is a valid and important use case. The only problem is that it’s a niche one and we aren’t currently recommending any tool for the general case of “build these packages”, which is the common case. The problem isn’t that your tool is scoped too narrowly, it’s that it’s very easy to confuse with a tool that does the common case.

PEP 517 is pretty explicit that you should be building each package in its own separate isolated environment. That isn’t strictly required by the spec, but skipping the isolation certainly violates its spirit. Even more important, build-time dependencies are not install-time dependencies, so they should not remain installed afterwards. It’s even called out as a bad idea in the section of PEP 517 that I linked.

I believe that python-build has a dependency on the pep517 library anyway, so I recommend making the default mode do exactly what pep517.build does, with the exception that if pyproject.toml is missing, or if any of the fields in the build-system table is missing, the missing pieces should be taken from the following defaults:

[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta:__legacy__"

I think pep517 itself offers some facility to do this automatically, possibly through envbuild (though it may not have the ability to modify the build-system table).
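
For illustration, here is a minimal sketch of that fallback logic. It assumes Python 3.11+ for tomllib (older interpreters would need the toml or tomli package), and the load_build_system helper is a made-up name rather than part of python-build or pep517:

import os
import tomllib  # assumption: Python 3.11+; use the toml/tomli package on older versions

# The PEP 517 legacy defaults quoted above.
_LEGACY_DEFAULTS = {
    "requires": ["setuptools>=40.8.0", "wheel"],
    "build-backend": "setuptools.build_meta:__legacy__",
}

def load_build_system(source_dir):
    """Read the build-system table from pyproject.toml, filling in missing fields."""
    path = os.path.join(source_dir, "pyproject.toml")
    if not os.path.isfile(path):
        return dict(_LEGACY_DEFAULTS)
    with open(path, "rb") as f:
        table = tomllib.load(f).get("build-system", {})
    return {key: table.get(key, default) for key, default in _LEGACY_DEFAULTS.items()}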

Well, this tool can be used as such; in fact, in Arch Linux we would use it as such. We set up a separate isolated environment to build a package. We need a tool to perform the building, but not the environment setup.

I agree. As per PEP 517 it would be better for the default mode of the tool to install the dependencies in a virtual environment.

python-build already falls back to the PEP 517 recommended defaults.

I would rather use venv to have reusable virtual environments. What do you think?

I think we basically agree here, but that is my main contention: when this is used in “package manager mode”, you aren’t just YOLO-ing it and installing all the dependencies into the global environment; you’re saying, “I will handle this myself”.

I think someone else will have to comment on exactly why pep517 didn’t choose venv, but I can think of a few objections:

  1. venv is only available in Python 3, and this tool seems to support Python 2 as well.
  2. In the vast majority of cases, you wouldn’t need or want reusable environments.
  3. A venv can be time-consuming to create (I assume pep517’s lightweight isolation is faster, but I can’t be sure).

Considering this thing is going to have a “package manager mode” anyway, the only value you’d get in using venv to create reusable build environments would be that the tool will read the requirements from pyproject.toml for you instead of having to manually get the list yourself. If this is a use case that a lot of people want, then it seems like a make-venv subcommand that creates a virtual environment for you would be prudent.
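
To make that concrete, a hypothetical make-venv subcommand could boil down to something like the sketch below; the function and directory names are invented, and it only handles the statically declared requirements:

import subprocess
import venv
import tomllib  # assumption: Python 3.11+
from pathlib import Path

def make_build_venv(source_dir=".", env_dir="build-env"):
    """Create a reusable virtual environment and pip-install the static build requires."""
    with open(Path(source_dir) / "pyproject.toml", "rb") as f:
        requires = tomllib.load(f)["build-system"]["requires"]
    venv.EnvBuilder(with_pip=True).create(env_dir)
    python = Path(env_dir) / "bin" / "python"  # Scripts/python.exe on Windows
    subprocess.check_call([str(python), "-m", "pip", "install", *requires])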

Another possibility that might be useful would be a subcommand that compiles a list of all build dependencies into a build-requirements.txt file or something. I feel like that might be a useful building block for automating the population of build dependencies in package managers, and also would make it pretty easy to construct a build virtualenv yourself (and this method has the advantage that you get to choose whether you want to use virtualenv, venv, tox, nox or something else to build the virtual environment). Obviously this is not mutually exclusive with a “make me a virtual environment subcommand”.
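
As a rough sketch of that subcommand for the statically declared requirements only (dynamic requirements reported by the backend are a separate, multi-step problem, as noted further down), with names that are illustrative rather than an actual python-build interface:

import tomllib  # assumption: Python 3.11+

def write_build_requirements(pyproject_path="pyproject.toml",
                             output_path="build-requirements.txt"):
    """Dump build-system.requires into a requirements-style file."""
    with open(pyproject_path, "rb") as f:
        build_system = tomllib.load(f).get("build-system", {})
    requires = build_system.get("requires", ["setuptools>=40.8.0", "wheel"])
    with open(output_path, "w") as f:
        f.write("\n".join(requires) + "\n")
    return requires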

virtualenv is <500ms :smile:, so virtualenv > venv, but it would require additional time at installation.

Yeah. I just want to be clear, as when I re-read my reply I realized it wasn’t. This is not a Python virtual environment. This is a chroot/systemd-nspawn container with only the required dependencies: basically a brand-new Arch installation with only the package’s dependencies installed.

Okay, that makes sense.

This is reasonable. I will use pep517.envbuild and maybe implement a make-venv command in the future.

That is also an option.

Hum… For now, I think we can use pep517.envbuild and consider this in the future if it makes sense.

Nice to see this package. I need to look at it in more detail, but this seems like exactly what we could use in Nixpkgs as well, where our tooling ensures dependencies are provided.

This package fits right next to the other proposed installation tool, Creating a package to _just_ install a wheel.

Be careful: getting build dependencies is a multi-step process. You first need to install the build-backend, and then run it to ask what else is needed.
In Fedora we were lucky that PEP 517 & co. came when the “dynamic BuildRequires” feature was fairly fresh, so we (Python maintainers) could push the multi-step feature into our build system. But if you do need a build-requirements.txt file before making the build environment (into which you don’t want to install anything during build), it gets complicated.
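
To illustrate the two steps, here is a sketch built on the hook caller from the pep517 library; it assumes the statically declared requires (i.e. the backend itself) are already installed in whichever environment runs this code, which is exactly the chicken-and-egg aspect described above:

import tomllib  # assumption: Python 3.11+
from pep517.wrappers import Pep517HookCaller

def get_all_build_requires(source_dir):
    """Static requires from pyproject.toml plus the dynamic ones reported by the backend."""
    with open(f"{source_dir}/pyproject.toml", "rb") as f:
        build_system = tomllib.load(f)["build-system"]
    static = build_system["requires"]
    hooks = Pep517HookCaller(source_dir, build_system["build-backend"],
                             build_system.get("backend-path"))
    # Step 2: the backend must already be importable in a fresh subprocess for this hook to run.
    dynamic = hooks.get_requires_for_build_wheel(config_settings=None)
    return static + dynamic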

I was thinking this would be more of a building block for something that generates the PKGBUILD file (in the Arch Linux case, for example): you could build a custom tool that uses virtualenvs and whatever else you want to figure out all the direct dependencies of your package, distro maintainers (or whoever) could write a utility to translate between pip-style requirements and Arch Linux requirements, and then someone could write a utility that downloads something from pip and generates a first-pass PKGBUILD from it automatically.

Another of the high-importance items on the list from the 2019 packaging summit was building out a system for declaring system-level dependencies, with the idea that there would be a translation layer that distros could populate with a mapping between the “Python way of declaring a dependency” and their individual way of doing so. Combine that with a tool that can extract a package’s dependencies, and it could take a decent amount of the grunt work out of going from “package on pip” to “package in a distro repository”, leaving package maintainers more time to do things like vetting the package.

That said, maybe this is veering a bit off topic. I think we’re agreed at this point that it’s YAGNI for the initial release.

OK, so with the change in scope of the project to be a more general end-user build tool, I’m now 100% enthusiastic about this again.

I think the only objection to this right now is the name, so once we’ve decided what it’s going to be called, should we just go ahead and say “motion passed” and move it into the PyPA?

@FFY00 Are you also looking for additional maintainers for this package to reduce the bus factor?

Yes, I also have python-install. At the moment it is not clear if that functionality will be merged into installer. These two tools should make the PEP 517 workflow available to packagers without using pip, which requires bootstrapping dependencies if you do not vendor them.

I don’t think that is very interesting for Arch. I am considering writing a tool to populate the PKGBUILD dependencies, but that one can use a virtual environment, as it would not be part of the packaging process (it would be a helper for packagers to run on their own machines).

That is interesting. If you do it, stick to asking for specific files (e.g. dynamic libraries: libfoo.so, PATH tools: bash); otherwise you will have a big problem with naming conventions (the FOO project can be called libfoo or foo depending on the system/distro).

Anyone, feel free to send suggestions :smile:.

I have some, but I think they might be too common:

  • builder
  • packer
  • package-build
  • packager
  • pyassemble
  • buildr

buildr and pyassemble seem the most promising.

Yes, definitely. Anyone who wants to help is free to start collaborating upstream. I am open to chat; you can reach me by email (lains@archlinux.org) or IRC (FFY00 on freenode).

I actually like the project/module name and would prefer to keep it as is. I wouldn’t expect the build directory to ever have an __init__.py file in its root, and the fact that it’s generally recommended to .gitignore the build directory means that python -m build is very unlikely to conflict with an existing module in the user’s project.

We have discussed this a bit on the pull request and it seems like the main blocker is the “searchability” of the name:

The problem is not what to use for python -m, but that build as a directory name is way too common and carries an established perception.

Personally I would go with a more unique name. python-build is also not a very good name since it is the pyenv command to build a Python installation, and I expect most Google results to point there due to pyenv’s popularity.

But since we are opening up the scope and bringing in more users, I think it should not be that much of an issue; the name will become more widespread as the user base grows. I think pip install build is very easy to remember.

If we had a good name to replace it, I would just use it instead, but that is proving difficult. So I think it is worth reconsidering keeping the name.

We don’t have a clear direction on where to proceed from here, so things are tricky :stuck_out_tongue:. I feel myself being pulled in both directions.

As long as you’re happy to extend the scope to support automatically provisioning build dependencies, I’m happy with going ahead with build and accepting the project under the PyPA. We can put the current behaviour under a feature flag (e.g. --isolation none, while the default would be --isolation pip). :+1: I’d be happy to help out with maintenance too if that’s the case.

After getting to play with build a bit over the last few days, I’m basically 100% in agreement with this (and what @pganssle similarly said above).

Indeed same here.

Another benefit of doing this is that it’ll let us remove the CLI from pep517 (the library) and start pushing users toward this new tool for the CLI-based use cases.

A gentle nudge to see what the state of this is. IIUC, the --isolation flag was marked as a “blocking” feature request for moving this project into PyPA.

With PEP 609 accepted, we now have a proper process to accept this into PyPA when we want to go down that route. :slight_smile:

Thanks :grin:

The --isolation flag (well, actually --no-isolation; isolation is the default behavior now) is implemented in master. I’d like to do a release, but there are a few things I want to get in place first. I am not sure whether it makes sense to propose now or after the release. I have been a bit busy, so I don’t have a concrete ETA for it; I’d say maybe 1 month, although it might have to be a bit longer than that.

@pradyunsg the issue tracker still has a few release blockers in it, see https://github.com/FFY00/python-build/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc

I would suggest that you fix the release blockers, ask formally for PyPA project acceptance, (hopefully) get acceptance, and then make the fresh release along with an announcement that it’s now a PyPA project. However, if there is something about PyPA membership that would logistically make it easier for you to fix the release blockers, then please say so and disregard my suggestion. :slight_smile:

Hi all, I would appreciate it if someone could help maintain/test the project on Windows. I don’t use Windows, and I end up spending a lot of time spinning up VMs and fighting Windows workflows I am not used to :sweat_smile:, so if anyone could lend a hand it would be really appreciated :blush:.
