Building distributions and drawing the Platypus

PEP 517 is out in the wild, and now we should figure out what UX we want to provide to our users for building packages going forward.

Status Quo

Our current answer to anyone who asks “how do I build a distribution for a package that uses PEP 517?” is pep517.build. It does the right thing, but pip and pep517 don’t share a lot of the underlying logic, especially around build isolation. Further, pep517 isn’t intended as a user-facing tool but rather a library shared by projects like pip.
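
For concreteness, this is roughly what the pep517 library provides today; a minimal sketch that skips build isolation, assumes the backend’s build requirements are already installed, and hardcodes setuptools.build_meta where a real frontend would read build-system.build-backend from pyproject.toml:

```python
# Minimal sketch of driving a PEP 517 backend via the pep517 library.
# Assumes the build requirements are already installed (no isolation);
# a real frontend would read the backend name from pyproject.toml.
import os

from pep517.wrappers import Pep517HookCaller

hooks = Pep517HookCaller(".", "setuptools.build_meta")

os.makedirs("dist", exist_ok=True)
wheel_name = hooks.build_wheel("dist")  # returns the built wheel's filename
print("built dist/" + wheel_name)
```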

Options at hand

As far as I can tell, there are 3½ main options on the table:

  1. pip build
  2. twine build
  3. Independent tool
    3a. A wrapper (let’s use the name pyp) that provides the overall experience we want to provide.

An argument for pip build is that pip already holds all of this logic. It currently handles building wheels from a source directory, and the intent is to have pip use repo -> sdist -> wheel -> install as the flow for installing packages. However, this will cause user confusion between pip wheel and pip build due to an overlap in functionality (both produce wheels), even though one is for building wheelhouses and the other for building distributions from source (it is possible to address that, even if not immediately).

An argument for twine build is that pip is for consumers and twine is for publishers, and folks building distributions are publishers. However, twine currently does not build distributions; it only operates on already-built distributions.

An argument for an independent tool is that since “build” both fits and doesn’t fit in pip and in twine, it would make sense to have an independent tool that can then be built upon by a wrapper tool that provides the experience we want for our end users. However, this introduces yet another option in the ecosystem, and there are complications w.r.t. separating the build logic from pip [note 1]. Not to mention, someone would have to write pyp. :slight_smile:


@dstufft provided related thoughts (and triggered me into making this thread) in a comment on pip’s issue tracker about pip build, where he points out that choosing between these three options depends on how we want our tooling to look going forward, including questioning whether the Unix-like philosophy makes sense for us.

To that end, there was a past discussion on distutils-sig about the “packaging elephant” (it’s a platypus now) which is related. That conversation was started by @njs and included @pf_moore, @cjerdonek, @techalchemy, @uranusjr, @takluyver and more. (yes, that was an excuse to @ people)

/cc @pganssle @ncoghlan

[note 1]: build environments need an installer to set them up, and it should be the same installer as the one that will install the package being built.
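
To make that concrete: a frontend doing isolated builds ends up doing something like the following hypothetical sketch, which is why it needs an installer (here, pip) on hand:

```python
# Hypothetical sketch of setting up an isolated build environment.
# The requirements listed below are illustrative; a real frontend reads
# them from [build-system] requires in pyproject.toml, then adds the
# dynamic ones returned by the backend's get_requires_for_build_wheel().
import subprocess
import venv

venv.create("build-env", with_pip=True)
python = "build-env/bin/python"  # assumes a POSIX layout

requires = ["setuptools>=40.8.0", "wheel"]  # illustrative values
subprocess.check_call([python, "-m", "pip", "install", *requires])
# ...then invoke the PEP 517 hooks inside that environment.
```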


IMO twine only makes sense as a separate command to the twine developers. It may be a good reason, but it doesn’t make sense to the end user. If pip upload downloaded, installed, and passed its arguments along to twine, it would make people happy. The UNIX philosophy of “fragment and die” is bad. Similarly, I’m perplexed by the pushback against pip as a build system. It has always contained a build system. So a pip build command that was implemented nicely as a library call would be fine with me.

The pip wheel and pip build distinction is a subtle one.

My answer is a combination of 1. and 3. pip should likely gain this ability anyway, since a number of users will want it at some point, and including it in pip is the friendliest way to deliver the functionality. But we should try to make pip’s implementation available to alternative frontends, so we don’t dig ourselves deeper into the pip-as-de-facto-standard-but-not-actual-standard hole we’re already in. Maybe it’d be possible to migrate pip’s internals to use pep517, and let pep517 serve that purpose.

IIRC twine was intended to be a playground for things that would/could eventually land in pip, so pip build is a logical choice to me. I’m not convinced that having separate tools for publishers and users holds up to scrutiny; it seems like a side effect of the difficult development history of the PyPA stack.

The differences from pip wheel are superficial and can be overcome by deprecation and/or UX improvements (e.g. an optional pip build -r <requirements file> parameter).

Definitely – regardless of which option we take, making it easier/possible for not-pip tools to do these things is a good idea. I was actually curious whether we could repurpose the name packagebuilder for the underlying support library here, maybe with pep517 as an underlying library for that?

One reason I am somewhat in favor of both a general wrapper tool that does everything (which is what a lot of people seem to want pip to be) and individual tools that each do just one thing (UNIX philosophy) is that it makes it easier to explicitly declare the dependencies you need or want (and to constrain those dependencies independently of one another).

In my tox.ini, I would like to say “the upload command depends on twine, the build command depends on buildtool”, and I’d also like to be able to constrain the versions I’m using independently. It happens often enough that some version of pip or twine breaks your workflow while you need or want the latest version of buildtool. Having individual tools that I can pin and depend on directly is a real benefit to me.
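
Something like this hypothetical tox.ini fragment (buildtool is the placeholder name from above, not a real package, and the pins are illustrative):

```ini
# Each task declares and pins its own tool, independently of the others.
[testenv:build]
deps = buildtool==1.0
commands = buildtool

[testenv:upload]
deps = twine>=1.13,<2
commands = twine upload dist/*
```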

I also agree with @dstufft’s comments on the pip build thread that the Unix philosophy, and adding more user-facing tools, does confuse people. This is why I’m in favor of one user-facing wrapper tool that people would install in whatever environment they are working in and that they can point to as “one simple tool that you can use”, with the individual tools twine, buildtool (and possibly tox and virtualenv) being somewhere between tools for advanced users and plugins for the wrapper tool.

I think that people would be happy if there were a tool that did this, and they probably don’t care whether it’s pip or not. I also think that there are a decent number of good reasons for pip to restrain its scope:

  1. It has a lot of well-established behavior that was designed in the context of being an installer, not a multifarious build tool. This could limit our ability to create a clean interface (see, for example, pip wheel).
  2. It is already bundled with the standard library and needs to vendor all of its dependencies. This puts constraints on it that a purely free-standing tool would not have.
  3. By incorporating everything directly into pip, I think we are in some danger of recreating the whole “unwritten de facto standards” problem. If we design a tool that is explicitly made to have good defaults with modular, pluggable subsystems, we are in some ways forced to stay on the “good path” by keeping specifications of interfaces as part of the design process rather than as after-the-fact documentation of how a given tool works.

By and large, the benefit we get from rolling everything into pip is that we don’t have to go through another round of churn and documentation of the new “best practices”. The downside to rolling everything into pip is the burden of backwards compatibility. Given the roiling and chaotic state of everyone’s understanding of the packaging ecosystem, I’m not sure that we’ll have a much easier task telling everyone to use pip subcommands than we will telling everyone there’s a new omnibus packaging tool that rolls up all the functionality of your favorite build tools.


Considering the overhead of introducing a new tool, I’m sure it would be easier to just say “Users! pip 20.x can now build your projects, too!”. I’m definitely leaning towards carefully considering xkcd://927 here.

OT: @jezdez I already linked to that comic above, in a similar context (see “yet another” in OP). ^-^

I saw! :smiley: Discourse even happily reminded me of it before posting but I wanted to reiterate that the struggle of reinventing the wheel (no pun intended) is real. Thanks for raising this issue :blush:

Oh, FWIW, my personal position is that I’m in limbo between the three options. The main reason I’d started with “let’s do pip build” is that it’s a code base I have access to and some familiarity with, and it’s where all the code lives today (albeit not super approachable code).

I’m willing to put in some cycles to do the work here, given we can figure out how we want to go about doing this.

This doesn’t really apply, for a number of reasons. For one thing, we’re not trying to get rid of additional things; we’re trying to make there be one obvious default option from among the options that we ourselves are offering (for example, we are not looking to subsume or eliminate conda or poetry). We’re trying to decide what a good user interface looks like.

I’m also not proposing that pip would go away and be completely subsumed by the superset tool. I’m specifically proposing that having a single wrapper tool would be a simple and pithy answer to “How do I do X?” for people who just want whatever the default is. People can adopt it or not, as desired.

The main concern is actually communicating the relationship between all the tools clearly and loudly. I think having a “wrapper tool” to rule them all actually solves that problem cleanly, even though it does mean there are at least two ways to do literally everything.

At the moment I’m leaning towards twine… It’s literally the package I install alongside pep517 every time :thinking:

While I personally don’t love the idea of pushing everything into pip, it’s kind of where the community seems to want to take things. And if that’s true, then Paul’s comment seems to align with the idea of adding git-style command expansion support to pip, where people can externally provide commands that they deem useful without having to ship them in the box. That way we get packages which implement the functionality, but pip provides the unifying CLI that everyone targets with their pip command package.


I don’t see it as so different, since a new tool would still need to become some kind of “standard” (be documented, well known, and technically solid) and get traction in the community to achieve this goal.

What I was trying to get at is that it may very well solve this problem now, but it doesn’t prevent having the same discussion at a future date, when new use cases (driven by the PEP process) require a different user experience and possibly a new tool. That would continue to put the burden on users to keep relearning how to do packaging.

So it seems like a question of continuing to develop packaging tools via composition/loose coupling (separate tools) or via integration/tight coupling of functionality.

As a historical note, this was proposed in the past and mostly failed on implementation details around subcommand discovery. But with the recent importlib.metadata/importlib-metadata work, this may be much simpler to maintain than before.
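
For instance, discovery could now be a fairly small amount of code. A sketch assuming a hypothetical “pip.commands” entry-point group (the dict-style entry_points() API shown here is the Python 3.8/3.9 one):

```python
# Sketch of git-style external subcommand discovery via entry points.
# The "pip.commands" group name is hypothetical; third-party packages
# would register their commands under it in their own metadata.
import sys
from importlib.metadata import entry_points  # importlib_metadata on older Pythons

def find_external_command(name):
    for ep in entry_points().get("pip.commands", []):
        if ep.name == name:
            return ep.load()  # the command's main() callable
    return None

command = find_external_command(sys.argv[1])
if command is None:
    sys.exit("unknown command: %s" % sys.argv[1])
sys.exit(command(sys.argv[2:]))
```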

A pip plugin system might or might not be a good idea; I dunno. (Though I’m a bit skeptical myself; plugin systems always seem like they’re going to solve more problems than they actually do.) But I definitely don’t think it will help us with the “elephant”/“platypus” project management tool idea – that will require a much more coherent and integrated set of commands, and some of its semantics are incompatible with pip’s existing commands (e.g. install would search the current directory for pyproject.toml and update pins, instead of installing directly into the current environment).

I think you may have missed my points:

  1. We don’t have to worry about the proliferation of “standards” or “tools”, because this proposal is not to get rid of all the other stuff; we’re intending to add an additional tool, and specifically an additional tool composed of the existing tools.

  2. To the extent that we do want it to replace something else, the thing it would be replacing is something we already control. The only thing we’re planning to replace is the invocation of setup.py for build commands, and we’re going to deprecate and remove those commands, so we’re not in any danger of having two competing and active ways to do things. We’ll just have a bunch of outdated documentation to update (which we already have to do anyway when we tell people to use pip instead).

Even though the xkcd comic is basically not relevant, it does rhyme with a concern that we do have: we have a complicated build ecosystem, and adding more tools without taking any away adds to that complexity. My suggestion is that there are good reasons to accept this complexity as the price of some other desirable features of the Unix-philosophy approach, and that the wrapper tool, if marketed correctly, could be a partial mitigation for the complexity problem. We can say, “Yeah, there are a ton of small purpose-built tools for different aspects of packaging, but there’s also one or more omnibus tools that give you a more npm-like experience, so you really only need to know the omnibus tool.”

This is the sentiment I hear most often, but I am still not convinced that the community really cares whether it’s pip or something else, just as long as it’s something, and preferably something “official” (to the extent that that means anything).

Sneaking onto my computer for a few minutes but I wanted to just add:

I think the question of whether something lives in pip or twine or some other random tool that may or may not exist right now is a bit premature. We don’t even know for sure what “it” is yet, so trying to figure out where to put it is harder, because everyone has different ideas about what “it” is.

Likewise, I think that whether “it” is implemented as a monolith, as a wrapper around other more focused tools, as libraries, or as some combination thereof is also premature, and also an implementation detail of “it”. We’ve learned that monoliths are generally bad, and whatever we do here, people are likely going to want to reuse at least parts of it, so whatever we pick is likely to be implemented in a way that other approaches can reuse.

So basically, I don’t think we need to legislate where our end goal lives, or that we’re going to implement it with sound engineering practices towards reusability.

So what do we need to figure out? Basically, what UX we’re trying to present here. The way I see it, there are a few possible options:

  1. An all-in-one tool that does everything from installation to upload to building.
  • It does not implement a PEP 517 backend, though.
  • Think npm or cargo style.
  • It’s possible that the “overlap problem” (see below) is narrow enough in scope that most commands are top level (install, upload), but then you have a few commands that either have longer names or are namespaced inside a subcommand for a specific persona (foo install, foo upload, foo build wheel, foo download-as-wheels).
  2. Persona-driven tools, aka an “I’m a project author” tool vs. an “I’m a project user” tool.
  • Think pip vs. twine + flit/setuptools/etc. in the current ecosystem.
  • It’s possible that (2) isn’t even two tools! For example, you could have a foo command where all of the top-level commands are focused on the “larger” persona (consumers), and then a subcommand group that itself has other subcommands handling the “smaller” persona (authors). So you’d end up with foo install, foo wheel (produces a wheelhouse), foo build {wheel, sdist, upload}.
  3. Unix philosophy: highly focused, one tool per task.

Again, to be clear, 1 can be implemented on top of either 2 or 3, and 2 can be implemented on top of 3. That doesn’t mean we should pick 3, because we shouldn’t be focusing on how we implement it, just on what our default UX is.

Personally, (3) is a big no from me. I think expecting users to install a different command for each thing they do is confusing and complex, and I think our users will hate us for it.

So to me, it really comes down to 1 vs 2, and that’s where I’m not really sure and I could easily argue one way or another here.

In favor of (1): it’s the simplest in terms of end users figuring out where some functionality lives; if there’s only one tool, then it all lives there (again, ignoring implementation concerns like splitting out libraries/tools, etc.). However, I think it has a real problem in terms of the varying user personas and what they’d want out of a tool.

For instance, with a “build a wheel” command, a consumer persona is going to want the tool and its options to focus primarily on downloading from a repository and building up an entire directory full of wheels. It’s going to want to fetch the entire dependency graph and ultimately produce a “wheelhouse” that can be installed from offline. However, an author is more likely to want to produce a wheel from an existing sdist they already have locally (probably from a “build an sdist” command), and isn’t going to want to build the entire dependency graph, just that one wheel. Even beyond that, a consumer isn’t likely to want a way to specify the particular Python tag they expect the generated wheel to have (and since you’re potentially building multiple wheels, you’d need some syntax for specifying different tags for different wheels in that case), but a project author quite possibly will want that functionality.

In favor of (2): by splitting along the lines of these “personas”, we can drastically simplify the cases where there are overlapping concerns. The current “twine upload” would be super simple to add to pip, because a consumer would just never call “pip upload”, so there’s no overlap there. However, for the build-wheel case, if we split the tooling by persona, we can have the “consumer” persona focus on producing a wheelhouse from a set of requirements and the “producer” persona focus on producing a single artifact from a local sdist.

The big downside of (2) is that, absent any other concerns, two tools are more complexity for end users than one tool. So we need to figure out whether the “overlapping commands” complexity outweighs the additional complexity of having two tools in our “default” UX, or whether the overlap isn’t really that bad and it would be better UX to have a single tool handle it all.

There’s also the question of environment management, and whether that should be part of this discussion or whether these tools should just operate on an existing environment. Personally I’m happy to say that at a minimum, all of these tools are going to need to at least support being invoked against an existing environment, so we can consider environment management at a later date.

Once we figure out what the overlap is, how bad of a problem it is, and what the ideal shape of the “default” UX is, then we can start figuring out whether it would make sense to put that in pip, twine, or some completely different tool.


I still need to read all of this thread, but I wanted to pick up on this point in particular. While “disk space is cheap”, pip’s vendoring of everything is a genuine issue (there’s a copy of pip in every virtualenv, for example, and on slow network connections, downloading the latest pip can significantly slow down virtual environment creation).

The point of having pip in the stdlib, and including it in virtual environments, is to address the bootstrapping issue of getting to a point where you can install “other stuff”. It’s not to have a big package development and management tool available everywhere. We should be very careful of bloating pip without considering this.

The one thing it might solve is the above problem: ship a “minimal” pip with the stdlib and virtualenv, sufficient to “pip install” the extra features on request. But the user experience of having to install “extra features” repeatedly is not ideal…

I like Donald’s persona based UX framing, and I also think there are genuine practical benefits to having the “for publishers” tool be distinct from the “for consumers” tool:

  • it means the publishing tool doesn’t need to vendor the world
  • it means the publishing tool can easily have platform specific dependencies
  • it means it is trivial to set up a system that omits the publishing toolchain but still supports installation

So my own vote would go towards “twine build”. If we were ever to add publishing capabilities to pip, they could then be in the form of an extra that depended on twine (thus keeping the above practical benefits) and turned twine commands into nested pip subcommands (e.g. “pip publishing upload” delegating to “twine upload” and “pip publishing build” delegating to “twine build”).
