User experience with porting off

I did read the discussion, and as I have said repeatedly on this forum, I believe that decision was wrong and harmful to users.

The tutorial does not make visually clear what you say it does. But that doesn’t matter anyway, because telling people what changes if you pick a different backend is not something that should be in this tutorial at all. This tutorial is the one piece of documentation on the site that actually purports to tell people how to package their project. It should tell them how to do that, not ask them how they want to do it.

What I’m taking from this thread is that people want other people to make PRs if they think the documentation needs improvement. I think the documentation on that page needs improvement, so I made a PR with my suggested improvements. If you don’t like it, you can reject it, and the docs will continue to be flawed. But maybe this means it would be helpful to have a Python Packaging Council, so that there is a group with a clearly-defined mandate for determining the direction the docs should take.

I want to emphasize here that although I’m frustrated by this process, I’m not trying to point fingers at people or cause more hassle for anyone. But I think the docs have major problems from a user perspective, and I think the best way to improve them involves a rather radical rethink; this PR is one small step in that direction.


I don’t believe the tutorial achieves that aim, because I think there are only two ways to do that:

  1. Pick a backend and tell them to use it.
  2. Include a complete discussion of the pros and cons of various backends so they can choose one.

Right now the tutorial doesn’t do either of those, so the net result is an increase in confusion.


That is an interesting article. Unsurprisingly, I feel it gives conda short shrift. :slight_smile: But apart from that, it’s nice to see a Venn diagram like that. Actually, something like that for build backends might be helpful for users needing to make a choice on that front.


What about updating Packaging and distributing projects — Python Packaging User Guide? That is supposed to be the guide on packaging, yet it doesn’t mention pyproject.toml at all. If that page were updated, I could see a set of tabs there being useful, and then maybe the tutorial could have them removed. Though I still think making the tutorial visually generic is useful, and it makes me happy to point at it from a build backend’s docs. It’s really obvious what needs to change to switch backends. I’ve never seen a reader who doesn’t already know packaging complain on packaging-problems about this tab set — only people here trying to “protect” other people from being confused. But if there were a good guide page, making the tutorial just a tutorial is reasonable. But not before that, IMO.

I think one of the big problems is that there are a lot of unmaintained pages with outdated info. Someone looking for packaging info sees a lot of pages, and many of them are terribly outdated. I’m always fearful when new pages are added (especially top-level ones like the packaging flow) instead of the older ones being rewritten to be current and up to date.


Thank you for sharing this. It’s a wonderful blog post, and there is a well-done talk as well.

Something that I really appreciate is the unbiased view instead of a tool-centric view.


One of the problems with consensus-based decision making (or, in fact, any sort of decision making!) is that sometimes the decision isn’t the one you want. At that point you have a choice between accepting and supporting the decision, trying to make it work in spite of your reservations, or simply repeating your assertion that the decision was wrong, in the hope of being able to say “I told you so” at some point…

Would you be so quick to say this if the council supported the current view on how the documentation is structured?

For that matter, the committers for that site are, in effect, the council for that site. So you’re basically just suggesting that a different decision-making body would be better. Why, exactly?

I don’t particularly like your PR, because it promotes hatchling, which isn’t my preferred backend. And I don’t think it’s a particular improvement over what’s there already. But it’s not my choice to make, ultimately — it’s up to the documentation maintainers (or maybe a future council, if they have the authority to overrule a project’s maintainers).


Are we okay with the only tutorial on the PyPA site being a packaging tutorial that requires users to learn what a build backend is and then go figure out how to choose one? Is this what we want for first-time package builders[1]? Why have only one tutorial[2]?

The hybrid approach of including too much for beginners and too little for experienced users makes this a bad tutorial for everyone. Just as an exercise: how well does the PyPA packaging tutorial follow Diátaxis’s guidance for tutorials?

  1. Given that you can perfectly well build a pure-Python package without specifying the backend at all. ↩︎

  2. I’d like to see a tutorial that is frontend-centric and guides a beginner without mentioning backends. Perhaps an excerpt at the end could point them to an advanced tutorial or a dedicated backend guide. ↩︎


IMO, one place with 5-7 separate topic guides / howtos rather than “tutorials”. FWIW these are the ones I’d personally love to see:

Topic Guides

Packaging a pure-Python project with basic defaults

Walk through the simplest possible case. Give some of the more important definitions. This needs to mention that backends, etc. exist, and link to other guides. It is important to convey to beginners that if they hit a wall with the simplest things, there is more to look into, and some idea of where to look.
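As a sketch of what such a “basic defaults” guide might open with — the project metadata and the backend choice here are purely illustrative, not a recommendation:

```toml
# pyproject.toml — minimal pure-Python project (illustrative values)
[build-system]
requires = ["flit_core>=3.4"]
build-backend = "flit_core.buildapi"

[project]
name = "example-package"          # hypothetical project name
version = "0.1.0"
description = "A minimal pure-Python package"
```

The point for beginners would be that this one file, plus their source tree, is enough to produce an sdist and a wheel; everything else the guide mentions is a pointer for when they outgrow it.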

Using and choosing build backends

Has many definitions: what is a backend, and why does it matter? Discussions and comparisons of common/popular backends and their configuration, at least one full example, and lots of links to backend projects’ docs.
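Such a comparison guide could also show concretely that, for simple projects, switching backends usually means editing only the [build-system] table (a sketch; version constraints are illustrative):

```toml
# Using hatchling:
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

# Or, swapping in setuptools instead — replace the table above with:
# [build-system]
# requires = ["setuptools>=61"]
# build-backend = "setuptools.build_meta"
```

Seeing the diff side by side takes much of the fear out of the choice: it is reversible, and the rest of pyproject.toml is largely shared.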

Handling advanced cases with compilation steps

A couple of full examples, plus discussion of and links to tools like scikit-build or the advanced build capabilities of other backends. There’s lots to potentially cover here, or at least mention and link to: Cython, wrapping C, C++ or Fortran, building/bundling TypeScript. Maybe suggest DPO as a place for deep technical questions.

Modernizing legacy usage

Address all the things that came up for the OP here, links to the other guides as appropriate. Quick tips/FAQs for common things e.g. “what do I do with these command line arguments?”

Publishing packages

Information about PyPI and twine.

For all of the above, where appropriate, having corresponding GH repos that users can clone and immediately toy with would be really nice as well.


3 posts were split to a new topic: Is it possible to go back to setuptools and as a frontend?

Captured in this issue: Add topic guides based on Discourse Discussion · Issue #1334 · pypa/ · GitHub


I wanted to chime in that I’ve been very encouraged by the activity on this thread!

I had typed up a lot of content that I ultimately stripped from my blog post on what I thought should be done and folks here seem to be gravitating towards a lot of what I was going to say.

In case you missed it, there was some additional discussion on this blog post on Twitter/X and HN. The largest themes I saw were:

  • Me too. I’m clearly not alone in this boat. Some people even said they gave up porting off python because they couldn’t figure out how to do it.
  • There was a lot of sentiment that authoritative guidance on what to do in Python packaging land is severely lacking. That good, trustworthy, modern documentation is hard to find.
  • Lots of people incorrectly believe that all of setuptools is deprecated and they need to delete
  • People seemed to be largely dissatisfied with the extra complexity from the introduction of pyproject.toml. However, I think the reasons are highly varied. (Some people just don’t like any change. Others are complaining about the lack of porting docs. Etc.)

In addition to the topics discussed so far, I want to raise a few more from my post.

Lack of a Build Frontend in the Default Distribution

I don’t fully understand why a build frontend isn’t present in Python distributions by default.

I think that shipping a build frontend in Python distributions could make things vastly simpler for end-users by eliminating a lot of cognitive overhead with having to think about which build frontend to use and how to install it.

Many languages do things this way (Ruby’s gem, Rust’s cargo, Go’s go). End-users seem to love the unified toolchain approach. And the presence of a default tool doesn’t undermine innovation in the larger ecosystem.

Before Python 3.12 (or earlier 3.x releases, if we want to be pedantic about setuptools availability), we had the ability to produce sdists and wheels using just the standard library’s distutils + [ensure]pip. But 3.12 fully removed this capability. If we want to be customer-focused and ease the transition for existing package maintainers, shipping a [simple-to-use] build frontend in the distribution seems like an effective way to do that.

Securely Installing Packages in pyproject.toml

Are my blog’s assertions about pyproject.toml build system package installation being intrinsically insecure accurate?

This question can be answered by stating how to deterministically bootstrap a Python build system frontend and backend and all transitive dependencies in a way that is robust against new package versions being published and is resistant to content tampering.

Is there any documentation on how to do this bootstrapping securely? Are there any discussions on it folks can link me to?
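For what it’s worth, pip’s hash-checking mode gets part of the way there today: if every entry in a fully resolved requirements file carries a `--hash`, pip refuses to install anything whose content doesn’t match. A sketch (package versions and digests below are placeholders, not real values):

```
# requirements-build.txt — fully pinned and hash-checked (placeholder digests)
build==1.0.0 \
    --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
packaging==23.1 \
    --hash=sha256:1111111111111111111111111111111111111111111111111111111111111111
```

Installed with `pip install --require-hashes -r requirements-build.txt`, this fails closed on both newly published versions and tampered content. What it doesn’t solve is bootstrapping pip itself, or generating and maintaining the pinned list in the first place — which is roughly the gap the content-digest-manifest idea below is aimed at.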

FWIW I have a half-baked idea for package registries to store content-digest-indexed manifests — think requirements.txt files or poetry’s equivalent — and then for package installer frontends like pip to be able to do something like pip install flask@sha256:deadbeef42... to download a content-indexed manifest stating all transitive dependencies to install. This way, deterministic install descriptors can be generated and used for reproducible, tamper-resistant installs. All an end-user has to do is refer to a short, immutable content digest instead of having to maintain the manifest themselves. This is conceptually similar to how OCI (read: Docker) image registries work — image manifest & image index.


2 posts were split to a new topic: Proposed alternative governance structure for Python packaging

build has a dependency on packaging and pyproject_hooks. If they are installed normally, it is problematic because there are other tools with a dependency on them, and you could no longer get a consistent environment in case the packaging or pyproject_hooks version wanted by a tool is incompatible with that wanted by build. Effectively, you would be adding dependency constraints on all Python environments. Vendoring is not good either because build also has an importable API. You would easily get two versions of the same package (packaging and build._vendored.packaging) imported into the same Python process.
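To illustrate the duplicate-module hazard described above, here’s a minimal, self-contained sketch: the same module source, loaded under two different module names (as a vendored copy effectively is), yields unrelated classes, so isinstance() checks across the two copies fail. All file, module, and class names here are made up for the demo.

```python
# Demonstrates why vendoring a package that is also importable from
# site-packages is hazardous: two loads of identical source produce
# two distinct class objects.
import importlib.util
import pathlib
import tempfile

SRC = "class Version:\n    pass\n"

def load_as(name, path):
    """Load the module file at `path` under the module name `name`."""
    spec = importlib.util.spec_from_file_location(name, path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return mod

with tempfile.TemporaryDirectory() as tmp:
    f = pathlib.Path(tmp) / "fakepackaging.py"
    f.write_text(SRC)
    global_copy = load_as("fakepackaging", f)                    # "site-packages" copy
    vendored_copy = load_as("build_vendored_fakepackaging", f)   # vendored copy

v = vendored_copy.Version()
# Despite identical source, the two Version classes are unrelated types:
print(isinstance(v, global_copy.Version))  # → False
```

This is exactly what happens when a vendored `build._vendored.packaging` object is handed to code expecting the site-packages `packaging` — same-looking types, failing checks.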

I am sympathetic to the desire of a unified toolchain, but I don’t think the stdlib is the best way to go about it.

Also note that it has been the case for a long time that you need an extra tool, twine, to upload packages to PyPI. This is not new under the sun.

This is a legitimate concern.

So? What’s wrong with that? I know of a handful of projects that vendor their dependencies and then import them into package-local namespaces so they don’t conflict with modules from site-packages. Pip is in that list. I admit it is a crude and somewhat less efficient solution. But it works.

For the rare project that wants to import build as an API and also needs to use packaging and/or pyproject_hooks, it seems to me that build could gain an API that forces it to import the global version of dependencies to avoid the multiple version problem. Again, a pattern I’ve seen in the wild.

I also believe this is sub-optimal and somewhat user hostile. But the set of people who want to upload packages is smaller than the set who want to build them, which is in turn smaller than the set who want to install packages. So it isn’t the highest priority to address.

I will note that once there exists a unified packaging tool that does everything under the sun, this problem becomes moot: you must ship a tool to install packages out of the box, and if that tool also does building and uploading, you get those features for free without having to debate whether to include another tool.


Pip does not have an importable API, though.

You also have to consider that packaging standards are quite in flux, and if everyone gets build preinstalled (or if it even becomes part of the stdlib), then rolling out changes will be much harder. Pip has “please upgrade” automatic notices to help with that, but as an installation tool, it is privileged.


I pinned versions of build dependencies in my pyproject.toml and the OpenIndiana package maintainer filed an issue that it breaks downstream packaging if my pinned version is different from whatever OpenIndiana is using.

This is exactly the kind of [undocumented] negative downstream packaging effect I was worried about when adopting pyproject.toml :confused:

So I guess as a package maintainer I have to choose between determinism and the convenience of downstream packagers.
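To make the trade-off concrete, here’s a sketch of the two styles (backend and version numbers are illustrative):

```toml
# Deterministic for your own PyPI wheel builds, but forces downstream
# distros to either match these exact versions or patch your metadata:
[build-system]
requires = ["setuptools==68.0.0", "wheel==0.41.0"]
build-backend = "setuptools.build_meta"

# Friendlier to downstream packagers: compatible ranges only, with exact
# pins kept out of pyproject.toml in a separate, PyPI-build-only
# requirements file. Replace the table above with:
# [build-system]
# requires = ["setuptools>=61"]
# build-backend = "setuptools.build_meta"
```

With the second style, determinism for your own release builds comes from the pinned environment you build in, not from the metadata every downstream consumer inherits.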

If I ignore pyproject.toml and just invoke python -m build from a [deterministic] virtualenv [using a requirements.txt with all versions pinned], I can have both.

Maybe the problem here is downstream distro packagers aren’t yet using modern packaging workflows. But something tells me they won’t like the new workflows for a myriad of reasons (including the non-determinism and the fact that the build frontend really likes to download things from the Internet).


You might want to read this discussion: PEP 665: Specifying Installation Requirements for Python Projects. PEP 665, proposing a lock file format (with hashes for security), was rejected due to lack of sdist support.

Yes, this is a key problem with pyproject.toml. It is actually not generic: the constraints you write in it are specific to how you build wheels for PyPI — they do not apply to other distros. This is not really documented anywhere, and is widely misunderstood by distro packagers.

This is not a pyproject.toml (metadata) issue, but is because of build isolation. You can still have both here, just set your virtualenv up the way you did before and turn off build isolation with, e.g. python -m build -wnx.


This is a significant problem because there is no other way to communicate this information to distros. I would have thought that the dependency spec in pyproject.toml should represent compatible ranges such that if everything is built from source then the build should be expected to succeed.

Separately, of course, there is a need to be able to pin specific versions of dependencies for generating the specific wheels that go to PyPI. Ideally that should be specified in some other way, though. Maybe that means you need a separate pyproject.toml file for e.g. cibuildwheel — or is it possible to configure these in a [tool.cibuildwheel] table somehow?
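For what it’s worth, cibuildwheel does read configuration from a [tool.cibuildwheel] table in pyproject.toml. One pattern I’ve seen for keeping exact pins out of the build-system requirements is pointing pip at a constraints file via the environment, so the pins apply only during CI wheel builds (the constraints file name here is made up):

```toml
[build-system]
requires = ["setuptools>=61"]       # compatible range, seen by distros
build-backend = "setuptools.build_meta"

[tool.cibuildwheel.environment]
# Exact pins applied only when cibuildwheel builds the PyPI wheels;
# PIP_CONSTRAINT is pip's environment-variable form of --constraint.
PIP_CONSTRAINT = "build-constraints.txt"
```

This is a sketch, not a recommendation — whether it fits depends on how your CI resolves the constraints file path inside the build containers.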

If I understand correctly numpy uses version pinning because of ABI compatibility between PyPI wheels but there should really be a different way to say “this wheel requires this exact other binary wheel”.

Not well. Pip’s vendoring causes problems for Linux distributors who don’t like vendoring (for legitimate reasons), and debundling causes problems for pip.

It’s the best we can do, not a good approach to recommend.