Step one would be to add the proposal to that list. Step two (likely needed before anyone would fund the work) would be to precisely define the deliverables. From that point, deciding on what resources were needed and how to find them should follow fairly naturally, alongside the process of finding someone to provide the money.
For a long time now, I’ve wanted to do something like this. I wasn’t aware that it had previously existed. I certainly think this is a useful direction for pip to take, although I’m not sure how close that gets us to what people mean when they say “environment management” - as usual, the difficulty is likely to be in agreeing what we actually want, in sufficient detail to deliver it.
We already have something like that with pip --python, which runs a second copy of pip using the target environment’s interpreter (basically re-using the isolated build environment code). So installing pip in every environment is no longer needed. It’s still done because of (a) inertia and legacy expectations, and (b) tools that expect to be able to run pip in a subprocess having no reliable way to find pip if it’s not installed in the environment.
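For example (a minimal sketch; pip’s global --python option accepts a path to the target interpreter):

```
# Install into .venv without pip being installed there;
# the outer pip re-executes itself against that interpreter.
pip --python .venv/bin/python install requests
```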
Hmm, is it just running the whole of pip inside that target environment? That feels more fragile than just interrogating the environment and running pip outside of it… but in any case, as long as it doesn’t depend on pip being installed in that environment, that’s excellent. I had missed that this feature had already been implemented.
I don’t think the -E flag, which had a UX similar to that of --python[1], is what people mean when they say environment management. I think that the ability to install into a Python you’re not running on was (until recently) the main technical blocker to doing so. With that ability, the problem shifts to defining what it is we want pip to actually do.
Just as an example, if we decide that something like PEP 704 is the way forward, we could modify it so that, instead of erroring out, pip locates where the environment should be, creates it, and then runs as if --python .venv had been passed. A sketch of that is below.
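Here is roughly what that could look like (hypothetical behaviour, not something pip does today):

```
$ pip install requests
# hypothetical: no environment is active, so pip determines the expected
# location (./.venv), creates it, then continues as if
# `pip --python .venv install requests` had been invoked
```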
Given the existence of --python, I think the hardest parts then become:
Getting agreement that evolving pip to be that unified tool is a good path forward, or at least could be a good path forward.
Getting agreement on what our desired end state actually looks like.
For the first of those, I think we can make a rough consensus decision on this thread. It wouldn’t block the ability of other tools to continue to exist, iterate, and compete. It would just be declaring that we view a future where pip was the primary tool for interacting with Python packaging for the 80% use case as a reasonable outcome. It wouldn’t be any sort of mandate, it would just really be keeping pip where it is now, but extending it so that people don’t have to also learn twine, virtualenv, etc.
If/once we had that rough consensus, then for the second of those, I think we’d be best served by taking proposals for what exactly our end-state goal should look like: what commands exist, what they do, etc. Then we weigh them against each other and figure out what path we can take to get from where we are now to where we want to go.
Then we would just need someone(s) to look at those proposals, and pick one as the roadmap for unification. That could be through the PEP process, or it could just be treated as a pip issue and let the pip maintainers select one. Then we “just” work towards that end goal [2].
Except it was implemented in a bad way: it executed the pip that was installed in the target environment. ↩︎
Of course, we can mutate that end goal as needed if we change our mind or something becomes more obvious as implementation happens. ↩︎
The new pip build and pip publish can be “clearly” end-user facing, and pip wheel might benefit from gaining a pip wheelhouse alias to reflect that it’s intended for creating a directory full of wheels. A sketch of that command surface is below.
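Laid out, that surface might look something like this (hypothetical command names, per the proposal above):

```
pip build        # build an sdist/wheel for the current project
pip publish      # upload to an index (what twine does today)
pip wheelhouse   # today's `pip wheel`: fill a directory with wheels
```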
I think we can get a long way with changes to help and documentation that separate the use cases/workflows clearly, plus communicating about this.
Right now the scenario that I have in mind from a UX point of view would be something like this:
1. Some kind of “bootstrap” tool:
   - easy to distribute, install, update, and use
   - possibly a single-file binary (not necessarily written 100% in Python *)
   - release schedule independent from Python versions
   - feature scope and use cases (make sure to cater to the simplest cases only, no feature creep):
     - install and manage Python interpreters à la pyenv, possibly others than just regular CPython
     - install and manage (lightweight) Python applications à la pipx
     - consumer of lock files
     - run Python scripts and commands à la py launcher; this could be the place to include support for __pypackages__ without changes to the Python interpreter itself

   * Most features would require a Python interpreter to be installed anyway, so once this is done the work can be delegated to some Python code executed by the Python interpreter.
2. A “developer” tool
From my point of view this could be hatch, pdm, or poetry. I think I would prefer this tool not to be pip (but if it is, it is fine by me). Maybe pip should stay focused on what it already does best, installing things, and maybe even unlearn things like pip wheel.
I think PyPA should pick one (hatch, pdm, pip, poetry, or whatever), not overthink it too much, and then slowly build it up into the thing that covers “the opinionated PyPA workflow(s)™”.
PyPA should choose, document, and recommend things like the project directory structure (src-layout, .venv at the root, tests?), handling of janitorial tasks, and so on. This step is important because then when a project does not adhere to this workflow, PyPA can say “sorry not supported by us” and move on (focus on the hard things: lock files, hard to compile dependencies, metadata override).
On the other hand, PyPA should still work on writing the standards and libraries that are less opinionated, in order to nurture a healthy competitive ecosystem for the things that PyPA does not want to (or cannot) support.
I guess a good rule to decide if a feature should belong in the bootstrap tool or in the developer tool would be whether or not the task requires writing Python code. For example if I want to use httpie, I should be able to do it without using the developer tool. If I want to create a library I should not be able to do it with the bootstrap tool.
I think the bootstrap tool was already described earlier in this thread (was it under the pyup name?). I would like to see such a tool, I would most likely use it every day.
Yes to finding a technical project manager (to write down specs and requirements), then getting funding, and finally hiring developers.
Ok, so packaging does some of what I want. But not all, and the platform tags it generates differ from what cibuildwheel creates.
It seems to me that this situation is crying out for standardisation in the standard library. It’s the only way that things like cibuildwheel and packaging can be made consistent with each other.
Can I ask where this is being worked on? Is it in packaging or somewhere else?
Thanks for explaining. This shouldn’t preclude providing the basic low-level packaging functionality in the standard library though, and I suspect this would be of enormous benefit.
Disclaimer: I am the creator of Poetry and one of its current maintainers.
I would like first to ask a question: is it the role of the PyPA to endorse or promote a single tool? Is this even needed?
As far as I know, other languages do not have a packaging authority, and it did not prevent their communities from rallying around a unified tool.
Poetry is the perfect example of this: it was never endorsed by the PyPA and was even in direct “competition” with the PyPA-backed tool at the time (pipenv), and that did not prevent it from thriving and gaining traction, making it the second most downloaded “packaging” tool (behind pip, obviously, and ahead of pip-tools and pipenv) today.
Poetry now has a presence, a community, a great team of contributors, and is popular enough to be seen as a potential unified tool. And now that Poetry supports plugins, it can be extended to support more use cases that are not part of the base workflow that Poetry provides (monorepo with workspaces – even though that might ultimately make it into Poetry at some point – or npm-style scripts support).
I know some of its detractors have been vocal about Poetry doing its own thing and not supporting standards, but bear in mind that Poetry was started before some of these standards even existed. However, PEP 621 support is coming, along with the deprecation/removal of non-standard dependency specification operators. With the user base that Poetry now has, these kinds of migrations take time to plan and do well to ensure nothing breaks.
Regarding extension building, Poetry gives you free rein to build them however you see fit while using its own build backend (poetry-core). You can specify a build.py script in which you can use pretty much any tool you like. Here is an example with Meson: pendulum/build.py at master · sdispater/pendulum · GitHub. You just have to add your build requirements in the build-system section in addition to poetry-core, and that’s it.
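Concretely, the pyproject.toml side of that looks roughly like this (a sketch; the build.py wiring itself is poetry-core specific, and the Meson requirement here is illustrative):

```toml
# Declare extra build requirements alongside poetry-core, as described
# above; build.py is then free to drive Meson however it likes.
[build-system]
requires = ["poetry-core", "meson"]
build-backend = "poetry.core.masonry.api"
```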
I am obviously biased, but Poetry covers a lot of the needs most users might have, and its popularity shows that. The fact that similar tools released after it did not gain as much traction is proof enough that the differences, even standards support, were not incentive enough to make the switch. This is especially true for Hatch, which does not support lock files, something users want and need, so switching to it would actually be a regression.
Do I advocate for Poetry to be this unified tool? In part, yes, but in the end I can’t make this decision for the users. What I know is that we are trying to build the best experience to make building and managing Python projects as intuitive and fun as possible. That’s what matters, the rest is secondary.
It’s one of the most common things that people ask for. Do we need it? I mean obviously the status quo works, but it solves a common problem people have with the tooling.
Rust has a packaging team, Go’s core team invented Go Mod, there’s probably other examples.
I don’t think anyone is suggesting we prevent there being options. The question is whether there should be a recommended or “default” option, not whether we should provide only one option. Obviously users want it, and they don’t feel well served by the status quo.
Taking a step back, I think right now is not a good time to even choose a tool as others have mentioned. I would like to strongly emphasize that the Python packaging community is fundamentally missing features that the future tool would be expected to have/improve upon.
Concretely, on the user facing consumer side we don’t have a lock file and that is a hard blocker. On the building side we don’t have a standardized way to build extension modules for build backends and, as the Conda folks have pointed out, the situation is super complex (my assumption is that we will never support every use case with ease, but we can come up with ways to easily solve the majority of these cases).
I think that we should really focus on these fundamentals first. If we do not, I think we’re not going to make any progress (kind of like how we aren’t making any in this thread).
Consultation and a lot of talking. Otherwise the SC could help set up a packaging advisory committee or something if you/Paul felt more comfortable with that.
I don’t want to side track on this topic, but:
If by “spearhead” you mean “invent”, that’s not going to happen because I personally don’t want that, but I am willing to help push anything this group rallies around.
If people don’t already trust me to do the right thing for this community, then I don’t know what more I can do to convince them to trust that I always have the community’s best interest at heart.
Haters gonna hate.
The building is separate from the rest since that comes down to what you put into [build-system] in your pyproject.toml, so there’s nothing to distribute (especially since you have to download your build dependencies anyway).
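For reference, that section is just a small table in pyproject.toml; a typical example (standard PEP 518 usage with setuptools, shown for illustration):

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"
```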
For me, I think when people say “environment management” it covers:
- Creating
- Selecting (as in how to specify which environment to use when there are multiple options)
- Deleting (if its location is non-obvious)

environments. After that, because pip has historically been installed into environments (and conda/mamba handle environment management already), I don’t think flags on pip pointing at specific environments have been what people have thought about. (A rough sketch of those three operations with today’s tools is below.)
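As a minimal shell sketch (assuming a POSIX venv layout):

```
python -m venv .venv                      # creating
.venv/bin/python -m pip install requests  # selecting: use the env's interpreter
rm -rf .venv                              # deleting: you must know where it lives
```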
That’s going to come down to your build tool then since cibuildwheel isn’t directly creating anything, but instead driving the build tools for your project. But at this point, packaging is as close as you get to a standard library for packaging specs.
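For illustration, here is the packaging library computing tags for the running interpreter (packaging.tags.sys_tags() is the relevant API; the tags a built wheel carries come from the build backend that cibuildwheel drives, which is where discrepancies can creep in):

```python
from packaging import tags

# Print the most-preferred tags for the current interpreter/platform.
for tag in list(tags.sys_tags())[:3]:
    print(tag)  # e.g. cp312-cp312-manylinux_2_17_x86_64 (platform dependent)
```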
I understand why you think this, but this isn’t going to happen to the extent you’re thinking. We are removing distutils from the stdlib for a reason already, so going too far into pushing things into the stdlib would be taking a step back.
Now, having said that, somehow making it so interpreters provide details about their wheel tags directly has been discussed, but that’s off-topic.
This is getting off-topic for end user UX, so I’m going to say that I personally do not support moving large chunks of packaging library code into the stdlib for a myriad of reasons and ask that other questions on this topic be done in a new thread.
I don’t think getting these features is going to change the situation. People who prefer X tool are always going to advocate that tool, and people who prefer Y tool are going to be grumpy if X tool is selected as the recommendation. Adding more check marks on the feature matrix isn’t going to change that.
I also don’t think it’s critical. It’s not like we’re choosing one tool that we’re freezing in time, and everyone must use that frozen in time tool forever. We’re choosing what tool to recommend, by default, that will continue to evolve and get new features, bug fixes, etc.
All of those things are technically possible to add to pip, even in a way that doesn’t break the usage of pip from within another environment manager like conda.
Here’s my argument for why it makes sense to just evolve pip:
We’ve, both the PyPA and the community at large, already chosen it. The community chose it as the default way to install stuff before the PyPA was anything more than a tongue-in-cheek joke about the lack of an authority. Python Core chose it as the default when they accepted PEP 453 and bundled pip inside of Python. The PyPA chose it when documentation was written on packaging.python.org and on pypi.org.
So, IMO, the decision has already been made. However, users want more, and we’re trying to retread that ground while ignoring the tool that has the vast bulk of community consensus around it already. On top of that, by reopening the discussion about what tool should be the recommended tool, we put ourselves in an unwinnable position.
Our only real choice here is whether we want to meet users where they are already, or whether we want to try and convince everyone to abandon the tools they’re already using to use something completely new.
In other words, we can more easily move an ecosystem by incremental improvements to pip than we can by boiling the ocean and trying to get everyone to migrate to something new.
I think we should also consider the practicality of adding all of the required extra features to pip. It would be a massive undertaking and would likely deter contributions (at least I would not want to contribute). Having a tool with the right UX calling tools that specialize (like pip for dependencies) is, in my mind, orders of magnitude easier and won’t take several years.
I would argue it somewhat changes things because it makes switching tools easier. Look at what a project had to go through to change build tools before and after pyproject.toml. If we can agree on the feature matrix we want to hit and get standards around them, then the default/baseline UX becomes more common, and the various tools end up driving experimentation and unique workflow needs instead of people having to use them for the same workflow that just happens to be implemented differently due to the lack of a standard.
I’m definitely not arguing against evolving pip to handle these virtual environment cases. I just wanted to make sure people were on the same page in terms of what “handling environments” means.
I don’t think I understand this statement. You’re saying you’re currently willing to contribute to pip, but if we decide to evolve pip to unify those things, you’re now unwilling to contribute to it? Or are you saying you’re willing to contribute to a unified tool, but only if it’s a greenfield project?
In any case, I think adding features to pip (something that has occurred regularly for about 15 years) is more feasible than us all deciding to bless a single tool besides pip as the recommended thing (something that has never happened, other than for pip itself).
What I mean is that I don’t think implementing those features is going to change the difficulty of making a decision. Let’s wave our wand and pretend we have lockfile support standardized and everyone now supports it.
Great, how does that help us decide which tool to bless? Are we concerned that whatever tool we decide to bless won’t implement a hypothetical lockfile standard in the future, such that we need to have it done prior? Do we think that implementations are going to differ so wildly that what tool we choose is going to hinge on exactly the semantics of how they implemented lock file support? Are we afraid that if we pick a tool and bless it, that they might come up with their own non standard lockfile format?
In all of the above cases, I think the pip developers have shown that they’re going to implement the standards that get defined, that they’re going to do it with an eye towards compatibility, and that they’re going to avoid introducing new, pip-specific features that ought to be standardized before such standardization exists.
In other words, I don’t think those extra features provide any extra information to inform our choices here, it just serves to delay making the choice. In some cases in the past, we avoided making a choice because ultimately we didn’t want to, and instead we implemented things like PEP 517 which allowed choices to be pushed onto the users. In those cases delaying making a choice was a good thing.
However, the status quo both before and after this proposed idea is that choices are still all wholly possible for end users, they can choose pip, or hatch, or poetry, or something completely different. The problem is that end users do not feel adequately served by the status quo, specifically because it requires them to make choices. There’s no way to avoid making a choice here, other than by pushing that onto a user base who have clearly communicated that they do not want that.
With that in mind, “wait until X”, in my opinion, only makes sense if X is going to alter the outcome of what choice we make, which I don’t think any of the proposed or hypothetical features will, because those are all standards that we would expect any choice we make to commit to implementing.
Yes, that is correct, for the same reason I wouldn’t want to contribute to Chrome if it decided to add features for task management like Jira. It would simply not be worth my time to rearchitect an existing large code base that has, generally, a singular purpose into a general-purpose jack of all trades.
Ok. That doesn’t make much sense to me as a stance, but I understand what your stance is here.
I’m personally not too concerned about it. I don’t think pip doing these things is some crazy, out-there scheme. They’ve been common suggestions by various people for something like 10+ years, I don’t think there’s going to be a large number of would-be contributors swayed too much one way or the other about it, and that same argument could be made for practically any feature added to any tool, because generally speaking nobody ever fully agrees on where exactly the lines are drawn between what “purpose” a specific code base serves.
I understand your view, but that also doesn’t make much sense to me. Allow me to express my point in a different way:
Do we think the backends of Flit, Hatch, Poetry, PDM, etc. were created just for fun, or because PEP 517 told us we could? No, it was because setuptools was too difficult to contribute to. And consider that in that case the work was merely improving upon setuptools’ central and only purpose of building packages. In the case we’re talking about here, we’re in a code base of equivalent size with even more complexity, and we’re talking about not just adding new features but fundamentally changing what it does/is.
I don’t think there is agreement that this is a desirable goal.
My understanding from PEP 517 to today is that each build tool (that compiles extension modules) can be configured in any way it needs or wants, and people will follow tool-specific docs to know how to configure it. As long as it integrates as a build backend then pip will be able to build the project, and that was the goal.
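For context, the integration surface in question is small; these are the mandatory hooks PEP 517 requires of a backend (signatures per the PEP), and everything behind them, including how extension modules get compiled, is tool-specific:

```python
# Mandatory PEP 517 hooks; pip only needs these to build a project.
def build_wheel(wheel_directory, config_settings=None, metadata_directory=None):
    """Build a .whl into wheel_directory; return the basename of the file."""
    ...

def build_sdist(sdist_directory, config_settings=None):
    """Build an sdist into sdist_directory; return the basename of the file."""
    ...
```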
Your recent message about standardizing metadata (source files and compiler commands) to abstract these extension module build tools is the first time I recall the idea coming up, and I haven’t seen enthusiasm for it in replies.
I will say I am quite confused why there is not much enthusiasm or understanding of the rationale behind that proposal.
Perhaps it is because I maintain a build backend that my understanding of the internals is blinding me to others’ views. In my mind, and in how the process works in reality, you have one component that selects which files are to be included in an archive and some other component that interacts with a compiler to generate files. There is no reason at all that they should be the same component. A sketch of that separation is below.
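To make the separation concrete (purely illustrative interfaces, not any backend’s real API):

```python
# Hypothetical split of the two concerns described above.
class FileSelector:
    """Decides which files end up in the archive (sdist/wheel)."""

    def select(self, project_root):
        ...


class CompilerDriver:
    """Invokes a compiler toolchain to generate artifacts (extension modules)."""

    def build(self, sources, out_dir):
        ...
```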