Python Packaging Strategy Discussion - Part 1

I would argue it somewhat changes things, because it makes switching tools easier. Look at what a project had to go through to change build tools before and after pyproject.toml. If we can agree on the feature matrix we want to hit and get standards around it, then the default/baseline UX becomes more common, and the various tools end up driving experimentation and unique workflow needs, instead of people having to use them for the same workflow that just happens to be implemented differently due to the lack of a standard.

I’m definitely not arguing against evolving pip to handle these virtual environment cases. I just wanted to make sure people were on the same page in terms of what “handling environments” means.

1 Like

I don’t think I understand this statement. You’re saying you’re currently willing to contribute to pip, but if we decide to evolve pip to unify those things, you’re now unwilling to contribute to it? Or are you saying you’re willing to contribute to a unified tool, but only if it’s a greenfield project?

In any case, I think adding features to pip (something that has occurred regularly for about 15 years) is more feasible than us all deciding to bless a single tool besides pip as the recommended thing (something that has never happened, other than for pip itself).

What I mean is that I don’t think implementing those features is going to change the difficulty of making a decision. Let’s wave our wand and pretend we have lockfile support standardized and everyone now supports it.

Great, how does that help us decide which tool to bless? Are we concerned that whatever tool we decide to bless won’t implement a hypothetical lockfile standard in the future, such that we need to have it done beforehand? Do we think that implementations are going to differ so wildly that which tool we choose is going to hinge on exactly the semantics of how they implemented lock file support? Are we afraid that if we pick a tool and bless it, they might come up with their own non-standard lockfile format?

In all of the above cases, I think the pip developers have shown that they’re going to implement the standards that get defined, that they’re going to do it with an eye towards compatibility, and that they’re going to avoid introducing new, pip-specific features that ought to be standardized before such a standard exists.

In other words, I don’t think those extra features provide any extra information to inform our choices here; they just serve to delay making the choice. In some cases in the past, we avoided making a choice because ultimately we didn’t want to, and instead we implemented things like PEP 517, which allowed choices to be pushed onto the users. In those cases, delaying the choice was a good thing.

However, the status quo both before and after this proposed idea is that choices are still wholly possible for end users: they can choose pip, or Hatch, or Poetry, or something completely different. The problem is that end users do not feel adequately served by the status quo, specifically because it requires them to make choices. There’s no way to avoid making a choice here, other than by pushing it onto a user base who have clearly communicated that they do not want that.

With that in mind, “wait until X”, in my opinion, only makes sense if X is going to alter the outcome of the choice we make, which I don’t think any of the proposed or hypothetical features will, because those are all standards that we would expect any choice we make to commit to implementing.

4 Likes

Yes, that is correct, for the same reason I wouldn’t want to contribute to Chrome if it decided to add features for task management like Jira. It would simply not be worth my time to rearchitect an existing large code base with a generally singular purpose into a general-purpose jack of all trades.

Ok. That doesn’t make much sense to me as a stance, but I understand what your stance is here.

I’m personally not too concerned about it. I don’t think pip doing these things is some crazy, out-there scheme; they’ve been a common suggestion by various people for something like 10+ years. I don’t think a large number of would-be contributors will be swayed much one way or the other by it, and the same argument could be made for practically any feature added to any tool, because generally speaking nobody ever fully agrees on where exactly the lines are drawn around what “purpose” a specific code base serves.

2 Likes

I understand your view but that also doesn’t make much sense to me :sweat_smile: Allow me to express my point in a different way:

Do we think the backends of Flit, Hatch, Poetry, PDM, etc. were created just for fun, or because PEP 517 told us we could? No, it was because setuptools was too difficult to contribute to. And consider that in that case the change was merely improving upon setuptools’ central and only purpose of building packages. In the case we’re talking about here, we’re in a code base of equivalent size with even more complexity, and we’re talking about not just adding new features but fundamentally changing what it does/is.

That should be a primary concern.

I don’t think there is agreement that this is a desirable goal.

My understanding from PEP 517 to today is that each build tool (that compiles extension modules) can be configured in any way it needs or wants, and people will follow tool-specific docs to know how to configure it. As long as it integrates as a build backend then pip will be able to build the project, and that was the goal.

Your recent message about standardizing metadata (source files and compiler commands) to abstract these extension module build tools is the first time I recall the idea coming up, and I haven’t seen enthusiasm for it in replies.

2 Likes

Yes, perhaps that is true, unfortunately.

I will say I am quite confused why there is not much enthusiasm or understanding of the rationale behind that proposal.

Perhaps it is because I maintain a build backend that my understanding of the internals is blinding me to others’ views. In my mind, and in how the process works in reality, you have one component that selects which files are to be included in an archive and another component that interacts with a compiler to generate some files. There is no reason at all that they should be the same component.

I think that they were created because the vision of what they wanted wasn’t possible within the constraints of existing solutions. Like there is no world where setuptools ends up shaped like flit, because the underlying goals and desires of those two projects are different.

I think that solving whatever problems people have with contributing to pip is a more tractable problem than getting agreement on something that isn’t pip. These discussions tend to go nowhere, because they devolve into politics about which tool gets blessed as the default.

Pip is already the “default”, so this sidesteps that. In fact, pip can start adding those features today, without anyone’s permission, and I suspect that if it did so, the “please provide a unified tool” talking point would just go away, because pip is already the default tool; it’s just implementing the features that people keep asking for.

Further, I don’t really think it’s “fundamentally changing what it does/is” any more than adding the wheel subcommand changed what it does/is. It’s not like it’s suddenly changing into a blog authoring tool or something.

5 Likes

I haven’t seen these things discussed before but that doesn’t mean that they aren’t good ideas. I’d be very happy for things to get a bit more standardised on the backend side of PEP 517 but for now just getting things working post-distutils is obviously the priority. I expect that later it will become clearer which things can be abstracted.

4 Likes

Aside from different historical alignments, I think it’s hard to ignore that one factor that also played a role was limited reviewing capacity / maintainer availability[1]. Similarly, there are limits to the capacity that pip maintainers currently have. Saying “let’s become that unified tool” (and I agree that pip certainly is the most central today and would offer likely the easiest transition), is like sticking a sign “Stampedes Welcome” on the door, but that doesn’t make the door larger.

So while I think it’s an intriguing idea to flesh out pip in this way, it would IMO have to come with a corresponding update of governance / expansion of maintainership to allow it to realistically grow all those features in something less than “years”.


Perhaps there’s another variant involving pip along the lines of what @rgommers described further up in his blogpost (using a hypothetical pyp) – rather than re-implementing everything in pip, it could delegate a lot of these new tasks to existing solutions.

That’s assuming that we can find the right interfaces, but that way we’d have a unified front-end, and the backends could be switched based on user preference. This wouldn’t absolve us of a choice of one tool per new task as a default, but under the hood (i.e. it would come through pip), and those defaults could change over time based on merit / popularity / necessity.

This is something the conda folks have wanted to do for years and years, to the point of having given up asking (AFAICT).


  1. as is the case for any other project… ↩︎

2 Likes

That is what I have been advocating for; it is quite literally why I rewrote Hatch, and it is how it works currently :slightly_smiling_face:

I agree, without some fundamental changes to how pip is maintained, this would be a very long process. (And I’m not sure what precisely you cover with the term “governance”, but one constraint we’d have to look very hard at is “willingness to break existing users for the sake of future growth”)

Speaking as another pip maintainer, I have a slightly different view than @dstufft. I think that he’s absolutely right that growing pip into the “unified tool” role is a great way to sidestep all of the questions about what tool to bless. But the difficult question for me is how do we get there from here.

Users want a unified tool. OK, but when do they want it? In 3-5 years? Maybe we could get pip to the point of being that tool in that sort of time period. In 6 months? Not a chance. Maintainer bandwidth is definitely one problem (and I can attest to the fact that working on pip, in particular the “maintenance” side, is very draining[1]). Other issues are the various legacy cases we have to handle - removing the “direct invocation of setup.py” code path has been a multi-year project blocking many new developments, similarly for the new resolver, and the idea of a “plugin” architecture and a stable API, which if implemented would help immensely with adding new commands/functionality. And there’s the whole scope question - while we might agree in principle with “becoming the unified tool”, each new command would need discussion and agreement. There’s been a long (again, multi-year) discussion on a pip run command that runs a script with its dependencies available in a temporary environment, for example.

In addition, this could involve significant bootstrapping issues for pip. If we offer new functionality, do we write it ourselves, or do we vendor an ever-increasing number of 3rd party modules? Do we need to re-architect pip to have a small “bootstrap” version that can self-extend to add functionality without vendoring? How else do we isolate our dependencies from the user’s? How will vendoring further libraries impact Linux distributions, who already need to decide how or if they devendor pip? Having a single centrally-installed pip might help here (it’s what other tools do) but when I introduced the --python option, I asked about that and the idea was not well-received (some objections were simply inertia, but there were enough genuine problems that the idea was dropped - specifically the fact that subprocess.run([sys.executable, "-m", "pip"]) is our supported method of running pip from within your code).
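For context, that supported pattern looks roughly like this in practice (a sketch; it assumes pip is available for the running interpreter):

```python
import subprocess
import sys

# The supported way to drive pip programmatically: run it as a subprocess
# of the target interpreter, rather than importing pip's internals.
result = subprocess.run(
    [sys.executable, "-m", "pip", "--version"],
    capture_output=True,
    text=True,
)
print(result.stdout.strip())  # e.g. "pip 23.3.1 from ... (python 3.12)"
```

Because this contract ties pip to `sys.executable`, a single centrally-installed pip would break any code relying on it, which is the concrete objection mentioned above.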

To repeat, I agree with @dstufft that growing pip to fill the “unified tool” role is a great option. But I don’t think we should underestimate the challenges.

Equally, though, I think people offering other projects as the “unified solution” may have underestimated the impact on them if they suddenly became the recommended tool and gained a userbase the size of pip’s (the one exception here may be Poetry, which already has a sizeable community, albeit a lot smaller than pip’s).

One final point - we’ve been gradually expanding pip’s capabilities for years now. And pip is already the standard tool shipped with Python. So in some senses “make pip the recommended tool” is the “do nothing” option - and could well be perceived by users as exactly that. If we took this route, how would we explain to users that we’d listened to their concerns and were giving them what they’d asked for?


  1. And I’d be against adding maintainers who only wanted to work on shiny new stuff and ignore the maintenance side, for the record. ↩︎

4 Likes

As I see it, certain areas are currently out of scope for pip, including creating sdists (use build), publishing (twine), setting up environments (venv etc.), or expanding a list of desired requirements into something like a lock file (pip-tools). Perhaps there’s more discussion internally about these, but not being a pip maintainer, my impression is that pip maintainers have largely said a firm no to such things.
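As a concrete illustration of how small and separate one of those pieces is today, environment creation is already handled by the standard library’s `venv` module; a minimal sketch:

```python
import tempfile
import venv
from pathlib import Path

# Create a throwaway virtual environment with the stdlib `venv` module,
# one of the single-purpose tools pip currently defers to.
with tempfile.TemporaryDirectory() as tmp:
    venv.create(tmp, with_pip=False)  # with_pip=False keeps this quick
    # pyvenv.cfg is the marker file that identifies a virtual environment
    created = (Path(tmp) / "pyvenv.cfg").exists()

print(created)  # True
```

An expanded-scope pip would presumably wrap or orchestrate pieces like this rather than reinvent them.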

I think that has made a lot of sense in the context we have (and I like working on tools with limited scope), but if you announced that pip’s scope was opening up to cover a set of these things that we currently point to separate single-purpose tools for, even if it was going to take a couple of years to flesh that out, I don’t think it would be seen as ‘doing nothing’.

Of course, forging a consensus on what set of things belonged in the new, larger scope would be hard, but it always is, and I don’t think we can address these concerns without some kind of consensus.

6 Likes

As someone who wears that hat, and has spent a not-insignificant amount of time in pip’s issue tracker, I am ~sure that various maintainers at various stages of pip’s history have said that they are open to the idea but that we need a broader discussion about it. :slight_smile:

In line with that, basically all maintainers who’ve spoken here are open to/supportive of the idea of expanding scope for pip. FWIW, I have a draft blog post with quotes from discussions here and on the issue tracker to that end as well.[1]


  1. Perfectionist tendencies have kicked in, and it’s got a larger scope than what it started with (i.e. response on here). ↩︎

5 Likes

I suspect if you asked them, they’d all say “right now” :slight_smile: but I also think that most of them are reasonable, and if they see movement towards it, they’ll be happy and see that we’re making progress. I don’t think there is a world where they get it in 6 months, no matter what we do, if they’re not already happy with the status quo options.

Yea, I may have brushed over them to some degree, but that’s largely because I view the challenges of evolving pip to that point as largely technical challenges, and in my opinion, technical challenges are more tractable than political ones.

I think the beauty of the idea is that it sort of is the “do nothing” option. It’s not really doing nothing, because we’re expanding pip’s scope and adding the features people say they want in the tool… but that’s something we’ve been doing all along. It’s really just being a bit more aggressive about expanding those features to arrive at the destination that people want.

How to communicate that out is always a struggle for us. We could write a PEP on it if we wanted, or put up a page on packaging.p.o, or even just put it into the pip release notes, or just start doing it and let people notice that pip is gaining these features over time.

2 Likes

Maybe (and this is a serious comment) the key takeaway from this discussion is that we need some sort of formal publicity / user liaison team, who are explicitly tasked with this?

1 Like

Yes, getting the message out is key. I think one low-cost way is pinning GitHub issue(s) stating the intent and high-level roadmap, and/or placeholder issue(s) for planned milestones.

I will also add that when pip embarked on another very difficult piece of work (the new resolver), there was visibility and comms that made it out there. Sure, perhaps it didn’t reach everyone, but it certainly was heard. Perhaps a similar approach could be taken, or something learned from that experience, for this new set of challenges.

One thing that’s unclear to me in this discussion is, what exactly do we want this hypothetical new system to do? I really liked @johnthagen’s example in another thread laying out some concrete examples of things that people might do, but it seems this thread has moved in a more abstract direction.

I think it is not so important to users whether the hypothetical unification is technically “one tool” or several, as long as they are designed coherently as an integrated set. As far as messaging, personally I think one thing that would go a long way toward making users feel like their concerns have been heard and that pip (or whatever set of tools) is responsive to them is a definitive section within the python.org docs that clearly lays out how to accomplish concrete tasks, clearly states that the way it lays out is the official way, and backs that up by clearly demonstrating how to accomplish all the tasks that people want to accomplish but that currently require navigating a labyrinth of conflicting toolsets. (In effect this would be a flowchart that includes choices like “does your project include code written in a language other than Python? if so, then do blah blah”, although it wouldn’t have to be structured like a graphical flowchart.)

3 Likes

To be honest, I do not believe in the “expand pip” option. At least, not today. If the pip team gets together, makes an appeal for more help to push in this direction, and comes back in 6-12 months showing that they have made real steps and can get there, then perhaps I would. But the weight of history, the complex legacy code, the backlog of issues and the difficulty of working on pip, and the important lower-level role it already fulfills as a pure installer are all stacked against this idea imho. So by all means try if you want, but it would be a poor conclusion to decide now that this is the way.

Poetry/PDM/Hatch are much more along the lines of what users seem to want. Each has its own problems and isn’t complete enough; however, if you took the best features of each, you’d have about the right thing. Now that has the same problem as the pip direction above: we cannot agree to pick one here, nor can we wave a magic wand and merge them. But again, that is not needed. It’s encouraging to see that the authors of each of these tools have shown up here (much appreciated @sdispater, @neersighted, @ofek and @frostming!). If you as authors of these projects would get together and work out a strategy, I’d believe in that working out well in a reasonable time frame, and making both Python users and the average package maintainer enthusiastic about this.

I’m not intimately familiar with each of Poetry, PDM and Hatch, but here is what I understand some of the key differences and issues to be:

  • Team: Poetry seems to have a good size team (multiple maintainers; 6 folks with >100 commits who were active in the last year). PDM and Hatch are both single-author projects.
  • User base: Poetry seems to have most users by some distance, Hatch seems to gain quite a few new users, PDM seem to be struggling a bit.
  • Build backend support: PDM supports build backends (PEP 517) the expected way, it seems that Poetry can be made to work by plugging an install command like pip install . into build.py (so Poetry’s approach is even more general than build backends, but a bit clumsier by default), and Hatch has no support at all for anything but its own pure Python build backend (and new plans do not look good).
  • Conda/mamba support: Hatch and PDM seem to have support (not sure how feature complete), Poetry does not.
    • Other package manager support isn’t present anywhere, but if conda support can be added, that leaves the door open for future other package managers (e.g. if enough people care to maintain Spack or Nix support, then why not?)
    • Virtual environment handling seems to me to be a subset of this, or another angle on it. There’s no good default, as the discussion on PEP 704 shows. PEP 582 (__pypackages__) is an entirely reasonable approach. PEP 704 doesn’t add much, but as long as there’s different ways of doing things it probably doesn’t hurt either. The important thing here seems to be to hide complexity by default from especially the beginning user, while allowing multiple approaches.
  • Plugin system: all three projects seem to have a plugin system and a good amount of plugins.
  • Lock file support & workflows: Poetry and PDM have lock file support, Hatch does not. There’s an issue around standardization here, TBD how that will work out. Conversely, Poetry and PDM seem to have issues with the “package author” workflow, which should not lock and should not put upper bounds on dependencies by default. Looks like that needs more thought in projects, and something like two workflows/modes: application development and package development.
  • Completeness of commands offered: it seems pretty complete for all projects, although a standardized test command at least would be a nice addition for all projects.
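The two-workflow distinction above can be sketched with a toy helper (purely illustrative; `is_pinned` is not any real tool’s API): an application lock file pins everything exactly, while a package author’s manifest stays open:

```python
# Purely illustrative helper, not any real tool's API: treat a requirement
# as "locked" when it pins an exact version with '=='.
def is_pinned(requirement: str) -> bool:
    return "==" in requirement

# Application workflow: everything resolved and pinned in the lock file.
lock_file = ["requests==2.31.0", "urllib3==2.0.7"]
# Package-author workflow: loose lower bounds, no defensive upper caps.
manifest = ["requests>=2.20", "urllib3"]

print(all(is_pinned(r) for r in lock_file))  # True
print(any(is_pinned(r) for r in manifest))   # False
```

A tool supporting both modes would consume the open manifest as input and emit the fully pinned set as output, rather than treating them as the same file.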

There’s probably more to compare here, but I think it’s safe to say that if these projects would join forces, we’d have something very promising and worth recommending as the workflow & Python project management tool. And it doesn’t require a 200+ message thread with everyone involved in packaging agreeing - if a handful of authors would agree to do this and make it happen, we’d be good here and could “bless” it after the fact.

8 Likes

If I understand Dependencies for build backends, and debundling issues - #29 by ofek correctly, that’s changing soon for Hatch.

I think you’re alluding to them automatically generating a lock file and then continuing to use that file going forward while it exists, regardless of whether it’s been checked into VCS?

Trying to standardize that has been brought up before and has typically been shot down (e.g. Providing a way to specify how to run tests (and docs?) ). I think the issue typically comes down to choosing a scope (an API or just a shell command), and then what to do about OS variances. But I’m a supporter of something like this, so I’m the wrong person to ask why we shouldn’t do it. :wink:

1 Like