Hi, I wanted to gather some opinions about potentially moving python-build to PyPA.
python-build offers a simple CLI to build distribution packages as defined in PEP 517. It is only a build tool; it does not do any sort of package management. We are moving in a direction where pip is becoming a “one tool does it all”, which is not compatible with several workflows, one of the main ones being Linux distributions.
Preempting some comments: this is different from python -m pep517.build, as pep517.build will invoke pip to resolve dependencies.
p.s. I suggest naming the module something other than build. This directory name is customarily used by a lot of build systems to hold intermediate build files, and is excluded by default by a lot of tools, such as GitHub’s recommended Python gitignore file. This can cause a ton of configuration headaches and unnecessary debugging time.
Okay, there will only be issues when build/__init__.py exists. I would argue that this is not common. As such, I have two proposals:
Rename the build module, as is, to python_build and add a build/__main__.py that just calls the python_build main. This way the affected users can still build using python -m python_build, while everyone else keeps using python -m build
Rename the module to something like pybuild, although I liked build more.
Proposal 2) is more conservative and should avoid any issues; proposal 1) provides a better general UX, in my opinion, but still leaves the potential for issues, while providing a viable way to work around them. My main concern about 1) is the debugging headache, but that should be mitigated by documenting it, no?
With that said, I like 1), but 2) might be better. I can’t make up my mind about which one to adopt; I am not entirely sure the tradeoffs are worth it.
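For concreteness, the shim in proposal 1) would only be a couple of lines. Here is a runnable sketch (the python_build.main name and signature are assumptions; a dummy python_build is written to a temp directory just so the shim can be exercised):

```python
import os
import runpy
import sys
import tempfile

# Lay out the proposed structure in a scratch directory:
#   python_build.py      <- the real code (stand-in here)
#   build/__init__.py    <- empty package
#   build/__main__.py    <- forwarding shim, so `python -m build` works
tmp = tempfile.mkdtemp()
os.makedirs(os.path.join(tmp, "build"))
with open(os.path.join(tmp, "python_build.py"), "w") as f:
    f.write("def main():\n    return 'built'\n")
open(os.path.join(tmp, "build", "__init__.py"), "w").close()
with open(os.path.join(tmp, "build", "__main__.py"), "w") as f:
    f.write("from python_build import main\nresult = main()\n")

sys.path.insert(0, tmp)
# `python -m build` is equivalent to running the package's __main__ submodule:
ns = runpy.run_module("build")
print(ns["result"])  # → built
```

The same invocation works as python -m python_build for anyone whose project already contains a build/ directory.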
Welcome @FFY00 and thank you for your request! Personally I think this is a fine idea for a project within PyPA. I filed an issue to ask you to add the PyPA Code of Conduct to the project. Once you resolve that plus the naming issue, I would be happy to have this project in the PyPA, but I defer to others’ judgment in case there’s some other barrier I haven’t noticed.
As of right now I believe the GitHub administrators for the PyPA org are the ones who would decide whether to add a project to the PyPA – see PEP 609: PyPA Governance for possible changes to that procedure.
I’m personally OK with this becoming a PyPA project; however, I also have naming/confusion concerns. build feels just a bit too general, as does even pybuild or python_build. You can build a lot of things, and this certainly does not do all of them. Perhaps pep517_no_deps is a better name considering what it does. PEP 517 makes installing the build dependencies also a job for the build frontend, so in that sense this feels like a subset of PEP 517. Then again, I feel like this could be part of the pep517 package, just as an extra option of --deps no-install (over the default of --deps pip).
We should not keep pep517 in the name. Long term, that’s nothing but obscurity, and it’ll become more and more inaccurate as more PEPs contribute to this area.
I think build is fine but risky. Pybuild is probably best.
But we could also consider eventually making something like this standard and make it a command line option of Python itself. Then the module name doesn’t even matter - “python --build” can do whatever it likes without relying on sys.path.
I was very excited by this at first, but I realize now that this doesn’t do quite what I was hoping. The way it works at the moment, this is not a suitable tool for us to promote to end users, because it requires that you execute it in an environment that already meets the build requirements, which violates the spirit and the explicit guidance of the “recommendations for build frontends” section of PEP 517. This tool seems like it is only useful as part of package managers that handle creating their own build environments.
I’m a big tent kind of person, so I think having a “build tool for distro packagers” in the PyPA org is not a big deal, but I think that unless this is modified to run the build backend in an isolated environment using pip to satisfy dependencies by default (with an option to skip this step if desired), then we should be very careful about the messaging here.
The biggest problem is that people don’t read things and they don’t understand what the PyPA is. People take anything in the PyPA org as “the official way to do X” and if they see a tool called build or pybuild, they may think that this is the official replacement for setup.py bdist_wheel sdist, which it cannot be.
So really, the problems I see are the extremely generic name (which gives no indication that this is a very “inside baseball” tool) and the fact that this almost, but doesn’t quite, addresses a major missing feature of the Python packaging ecosystem. I fully acknowledge that it’s a bit unfair to say, “We can’t bring in your tool because it is too similar to a tool that we want to exist but haven’t built yet”, but on the other hand, that is a real problem.
Personally, I’d love to see this resolved in favor of expanding the scope of build to be the tool for building wheels and source distributions; possibly one that has an alternate entry point for “Bring your own Environment” minimal mode. I realize that’s asking for Filipe to build (or maintain, assuming someone else is willing to build out the new scope) something new that he may have no interest in, so if that’s not in the cards, then I think the focus should be on making sure no one is misled by accepting this package.
My main goal for this was to solve the PEP 517 building problem for distributions. There was a hole in the ecosystem, and this tool aims to fill it. I believe this is a very valid use case, and as such it would have a place in PyPA, even if it’s not directly targeted at end users.
I did not implement any sort of package management because tools like pip already work in such use-cases. However, I understand that if you are building a lot of packages and some have incompatible dependencies, you may need to set up virtual envs. So, I think it would be okay to change the scope of the project to include that use-case.
If I understand correctly, I would create a virtual environment for each project, install the build system dependencies and then proceed with the build. Is this right @pganssle?
About the name, pybuild is already taken by Debian. @uranusjr suggested casei (see the discussion here):
The term wheel is a reference to cheeseshop, the code name of the old PyPI implementation, because the idea is like a cheese wheel: pre-built and ready to be sliced and consumed. Cheesemaking is sometimes called caseiculture (the casei part comes from the Latin caseus, i.e. cheese).
The only problem I have with it is that it isn’t totally obvious, which I would like, but if we change the scope of the project to make it more mainstream, I think it’s fine.
To be 100% clear — I agree that this is a valid and important use case. The only problem is that it’s a niche one and we aren’t currently recommending any tool for the general case of “build these packages”, which is the common case. The problem isn’t that your tool is scoped too narrowly, it’s that it’s very easy to confuse with a tool that does the common case.
PEP 517 is pretty explicit that you should be building each package in its own separate isolated environment. It’s not strictly required by the spec, but skipping it is certainly a violation of the spirit. Even more important is that build-time dependencies are not install-time dependencies, so they should not remain installed afterwards. It’s even called out as a bad idea in the section of PEP 517 that I linked:
I believe that python-build has a dependency on the pep517 library anyway, so I recommend making the default mode do exactly what pep517.build does, with the exception that if pyproject.toml is missing, or if any of the fields in the build-system table are missing, they should be taken from the following defaults:
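To make that concrete, here is a minimal sketch of the fallback logic (the defaults below are the legacy-setuptools ones that PEP 517-era tooling converged on; the helper name is made up):

```python
# Fallback values for source trees with no pyproject.toml or with an
# incomplete build-system table: the legacy setuptools backend.
DEFAULT_BUILD_SYSTEM = {
    "requires": ["setuptools >= 40.8.0", "wheel"],
    "build-backend": "setuptools.build_meta:__legacy__",
}

def resolve_build_system(pyproject):
    """pyproject: parsed pyproject.toml as a dict, or None if the file is absent."""
    table = (pyproject or {}).get("build-system", {})
    # Take each field from the table when present, otherwise from the defaults.
    return {key: table.get(key, default)
            for key, default in DEFAULT_BUILD_SYSTEM.items()}

print(resolve_build_system(None)["build-backend"])
# → setuptools.build_meta:__legacy__
```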
Well, this tool can be used as such; in fact, in Arch Linux we would use it as such. We set up a separate isolated environment to build a package. We need a tool to perform the building, but not the environment setup.
I agree. As per PEP 517 it would be better for the default mode of the tool to install the dependencies in a virtual environment.
python-build already falls back to the PEP 517 recommended defaults.
I would rather use venv to have reusable virtual environments. What do you think?
I think we basically agree here, but that is my main contention: when this is used in “package manager mode”, you aren’t just YOLO-ing it and installing all the dependencies into the global environment; you’re saying, “I will handle this myself”.
I think someone else will have to comment on exactly why pep517 didn’t choose venv, but I can think of a few objections:
venv is only available in Python 3, and this tool seems to support Python 2 as well.
In the vast majority of cases, you wouldn’t need or want reusable environments.
A venv can be time-consuming to create (I assume pep517's lightweight isolation is faster, but I can’t be sure).
Considering this thing is going to have a “package manager mode” anyway, the only value you’d get in using venv to create reusable build environments would be that the tool will read the requirements from pyproject.toml for you instead of having to manually get the list yourself. If this is a use case that a lot of people want, then it seems like a make-venv subcommand that creates a virtual environment for you would be prudent.
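A make-venv subcommand like that could be a thin wrapper over the stdlib venv module. A minimal sketch (with_pip=False here just to keep the example fast; a real build environment would pass with_pip=True and then install the requirements read from pyproject.toml into it):

```python
import os
import tempfile
import venv

# Create a reusable build environment in a scratch location.
env_dir = os.path.join(tempfile.mkdtemp(), "build-env")
venv.EnvBuilder(with_pip=False).create(env_dir)

# Locate the environment's interpreter (layout differs on Windows).
bin_dir = "Scripts" if os.name == "nt" else "bin"
python = os.path.join(env_dir, bin_dir, "python")
print(os.path.exists(python) or os.path.exists(python + ".exe"))  # → True
```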
Another possibility that might be useful would be a subcommand that compiles a list of all build dependencies into a build-requirements.txt file or something. I feel like that might be a useful building block for automating the population of build dependencies in package managers, and also would make it pretty easy to construct a build virtualenv yourself (and this method has the advantage that you get to choose whether you want to use virtualenv, venv, tox, nox or something else to build the virtual environment). Obviously this is not mutually exclusive with a “make me a virtual environment subcommand”.
Yeah. I just want to be clear, as when I re-read my reply I realized it wasn’t. This is not a Python virtual environment; it is a chroot/systemd-nspawn container with only the required dependencies. Basically a brand-new Arch installation with only the package dependencies installed.
Okay, that makes sense.
This is reasonable. I will use pep517.envbuild and maybe implement a make-venv command in the future.
That is also an option.
Hum… For now, I think we can use pep517.envbuild and consider this in the future if it makes sense.
Be careful: getting build dependencies is a multi-step process. You first need to install the build-backend, and then run it to ask what else is needed.
In Fedora we were lucky that PEP 517 & co. came when the “dynamic BuildRequires” feature was fairly fresh, so we (Python maintainers) could push the multi-step feature into our build system. But if you do need a build-requirements.txt file before making the build environment (into which you don’t want to install anything during build), it gets complicated.
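To illustrate the two steps with setuptools’ PEP 517 backend (this assumes setuptools is already installed, i.e. the first step of installing build-system.requires is done; the hook’s exact output varies by setuptools version, so none is asserted here):

```python
import os
import tempfile

# Step 1 (done outside this sketch): install build-system.requires,
# which includes the backend itself -- here, setuptools.
from setuptools import build_meta

# Step 2: run the backend's hook to ask what *else* it needs at build
# time.  Do this inside a scratch project directory.
cwd = os.getcwd()
tmp = tempfile.mkdtemp()
os.chdir(tmp)
try:
    with open("setup.py", "w") as f:
        f.write("from setuptools import setup\nsetup(name='demo', version='0')\n")
    extra = build_meta.get_requires_for_build_wheel()
finally:
    os.chdir(cwd)

print(extra)  # a list, e.g. ["wheel"] on older setuptools, [] on newer
```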
I was thinking this would be more of a building block for a tool that generates, e.g., the PKGBUILD file (in the Arch Linux case). You could build a custom tool that uses virtualenvs and whatever tools you want to figure out all the direct dependencies of your package; then distro maintainers (or whoever) could write a utility to translate between pip-style requirements and Arch Linux requirements, so that someone could download something from pip and generate a first-pass PKGBUILD from it automatically.
Another of the high-importance items on the list from the 2019 packaging summit was building out a system for declaring system-level dependencies, with the idea that there would be a translation layer that distros could populate with a mapping between the “Python way of declaring a dependency” and their individual way of doing so. Combine that with a tool that can extract a package’s dependencies, and it could take a decent amount of the grunt work out of going from “package on pip” to “package in <distro>”, leaving package maintainers more time to do things like vetting the package.
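As a toy sketch of that translation-layer idea (the mapping and the fallback naming rule are invented for illustration; real distro package names differ, which is exactly why the map would have to be distro-maintained):

```python
# Distro-maintained mapping from pip project names to (invented,
# Arch-style) distro package names.
PIP_TO_DISTRO = {
    "setuptools": "python-setuptools",
    "wheel": "python-wheel",
}

def to_distro_depends(pip_requirements):
    # Unmapped names fall back to a naive "python-<name>" guess, which a
    # maintainer would then review.
    return [PIP_TO_DISTRO.get(name, "python-" + name) for name in pip_requirements]

print(to_distro_depends(["setuptools", "wheel", "flit_core"]))
# → ['python-setuptools', 'python-wheel', 'python-flit_core']
```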
That said, maybe this is veering a bit off topic. I think we’re agreed at this point that it’s YAGNI for the initial release.
Yes, I also have python-install. At the moment it is not clear if that functionality will be merged into installer. These two tools should make the PEP 517 workflow available for packagers without using pip, which requires bootstrapping dependencies if you do not vendor them.
I don’t think that is very interesting for Arch. I am considering writing a tool to populate the PKGBUILD dependencies, but that one can use a virtual environment, as it would not be part of the packaging process (it would be a helper for packagers to run on their machines).
That is interesting. If you do it, stick to asking for specific files (e.g. dynamic libraries: libfoo.so; PATH tools: bash); otherwise you will have a big problem with naming conventions (the FOO project can be called libfoo or foo depending on the system/distro).
Anyone, feel free to send suggestions.
I have some, but I think they might be too common:
buildr and pyassemble seem the most promising.
Yes, definitely. Anyone that wants to help is free to start collaborating in the upstream. I am open to chat, you can reach me by email (email@example.com) and IRC (FFY00 on freenode).
I actually like the project/module name and would prefer to keep it as is. I wouldn’t expect the build directory to ever have an __init__.py file in its root, and the fact that it’s generally recommended to .gitignore the build directory means that python -m build is very unlikely to conflict with an existing module in the user’s project.