I’ll second this sentiment. I’ve raised packaging-like questions with the broader core developer group before - the responses tend to range from “what’s packaging?” to “I don’t care about packaging” to “they can do whatever they want”. I’m afraid most of the support from the core team comes from those of us who participate in this category, which is very few of us.
And there’s definite opposition to tools that change the “normal” way of launching Python. No tool that requires you to type anything besides python[3[.x]] has a chance of becoming the default in the upstream distributions, in part because of core dev preferences but also because the vast majority of users appear to want python[3] (rather than another tool or even py). The best way to get a new tool into the upstream distros is to prove that users are all installing that tool already.
And I’m saying “upstream distros” on purpose, because there’s no reason you couldn’t create a new distro that includes whatever default tool you like. In some ways, the core team would be happier to just release a source tag and let other groups do the builds and distribution.[1] The license explicitly allows this.
What we won’t ever accept is people trying to use our existing “reach” to promote their own tool. If your tool has the reach already, we’ll consider making users lives easier, but you aren’t going to become a popular tool by getting into upstream CPython first. Similarly, if pip started becoming something other than a way to install into the current Python environment, we’d probably be quite happy to drop it.
We’re not happy when users of other distros blame upstream for distro-invented problems… ↩︎
This discussion makes me think of Node.js’ corepack. I have only looked at it and used it very briefly; I have no idea how it works exactly, nor how it would translate to the Python world. Maybe it is kind of like ensurepip?
This is my third attempt at writing a response. I’m going to keep it brief(-ish).
I don’t think the current separation between the core devs and the packaging community is healthy - particularly given that the core devs are removing functionality from the stdlib with the justification that it can be accessed from PyPI. If that’s the justification, “making it possible to access PyPI” is very much a core responsibility.
Pip is in the stdlib as a bootstrapping mechanism. The pip devs need to be very careful not to abuse our privileged position to extend pip’s scope in a way that’s unacceptable to the core devs and SC. Of course, this argues again for a healthier relationship between the core and the packaging community, because otherwise how does pip know what’s OK?
IMO dropping pip from the stdlib without replacement is unacceptable. It would damage the experience of far too many users, whose first experience of Python is “install from python.org, use pip to get access to PyPI”. Maybe a streamlined “bootstrap installer” would be a better stdlib component than pip, but who’s going to write such a thing? The packaging community has no motivation, and as you said, the core devs don’t care. Driving the process by the threat that pip will otherwise be removed without replacement should (I hope) be considered unacceptable.
Other solutions are possible. Making zipapps work better, and shipping a pip (or pipx) zipapp with core Python might be an option, for example. But only if the core devs take a more active interest in the deployment side of the developer experience.
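For what it’s worth, the stdlib zipapp module already builds single-file .pyz apps that run with a plain python invocation, which is the mechanism a pip (or pipx) zipapp would rely on. A minimal sketch of that mechanism (the demo app here is invented for illustration; it is not how pip actually packages itself):

```python
import pathlib
import subprocess
import sys
import tempfile
import zipapp

# Build and run a tiny zipapp: a directory with a __main__.py is
# archived into a single .pyz file that the interpreter can execute.
with tempfile.TemporaryDirectory() as tmp:
    src = pathlib.Path(tmp) / "demoapp"
    src.mkdir()
    (src / "__main__.py").write_text('print("hello from a zipapp")\n')

    target = pathlib.Path(tmp) / "demo.pyz"
    zipapp.create_archive(src, target)

    # Running the archive executes demoapp/__main__.py.
    result = subprocess.run(
        [sys.executable, str(target)],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # -> hello from a zipapp
```

A hypothetical pip.pyz shipped this way could be updated independently of the interpreter, which is part of the appeal.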
Yeah, totally agree with you on all points. I don’t mean to make it sound like a demand on pip, though I am trying to accurately reflect the discussions that have happened on the core side (which I know you’re also involved in, Paul, though I’d submit from a less neutral position than myself).
I definitely don’t want to see pip removed from core. However, I also don’t want to see pip refusing to install “globally” or requiring the creation of a venv or something heavier-weight - broadly, becoming workflow-oriented rather than task-oriented. Which I believe is the same sentiment you (Paul) expressed earlier.
Honestly, I’d hope that pip would be begging to be removed (for their own project management reasons), and the core team would be terrified of removing it (for usability reasons). Just as I hope that redistributors will start shipping their own Python distros - no volunteer-driven project wants to grow the audience, because that just means we have to spend more personal time on support. If pip views being in the stdlib as just a way to grow their own userbase, that’s unhealthy for the overall ecosystem and we should be discussing it (but I don’t think that’s the view - I think it’s more about responsibility and caring for users).
I expect what would happen would be much the same as any big breaking change: if the core team (release manager) believes that a particular version of pip is too impactfully different from the previous one, it just won’t get updated. And maybe after a long while that leads to a fork and patches for core distribution, and eventual replacement with something else that is just an install command, or maybe we actually have a working relationship and find a way to keep the bundled functionality simple/direct.
Anyway, this is getting way off topic, so to bring it back: one reason pip doesn’t have workflow-like functionality is because we don’t want to ship workflow-like functionality “by default” with the core runtime, as it’s not clear what that workflow should look like.
I don’t know that it’s an “of course” - certainly not in the sense that a for-profit company, whose shareholders’ income is a multiple of the number of paying customers, “of course” wants to grow its audience.
The motivations for FOSS are far more aligned with solving problems. We don’t get any more reward from growing the userbase than we do from solving problems that people need solved. So “more users” is a sign that we’re solving the right problem, but generally not the core of the motivation.
And in this particular case, I’m referring specifically to the pre-built distros on python.org for Windows (and I believe macOS, though I won’t speak directly for the guys who maintain that). We are doing those because nobody else is, and if someone else was making a more popular pre-built distro (and supporting their users), I certainly wouldn’t be at all offended.
This is a surprising perspective to me! The mission of the Python Software Foundation is explicitly about growing the community and promoting adoption. Of course individual projects might not be focused on that goal and not everything falls under the PSF umbrella. But I understood it as an overarching motivation, and many decisions are discussed here with that perspective.
The PSF isn’t me, an individual volunteer contributor, and isn’t the core dev team either. It’s possible to have different missions.
For the most part they align, but you’re allowed to be a contributor with motivations that don’t directly match the PSF’s mission statement.
[Later] And to justify the PSF’s mission somewhat, “growing the community” is distinctly different from “growing the userbase”. The user base is millions larger than the “Python community”, and so the hope is that the PSF will work to connect more of those users in ways that grow the community through events and other approaches that require more organisation than most individuals can handle (e.g. try getting insurance to hold a public meeting).
“Promoting adoption” is a nice thing to have in there to prevent the PSF from taking actions that would discourage users from using Python, and to justify solving “unpopular” problems that nonetheless help users. For example, supporting contributors directly, infrastructure maintenance/costs, and more like that, as well as owning and defending the Python trademarks. These don’t look like “growing the community”, and so it could be argued that the PSF shouldn’t be doing them, but they are essential to the overall mission.
Nothing that I say here represents the PSF’s position on anything - you’ll have to look to their board for statements (which don’t necessarily represent my position on anything). And things I say only represent the core devs up until they kick me out for misrepresenting them. But I’m trying to be more general than that, and most of the motivations I’ve been talking about are intended to cover all the projects that are being thrown around as the One True Workflow Tool, and OSS contributors in general. Wanting to do OSS for “clout” is okay, but as a mentor I usually recommend against it, and in places where I get to be a gatekeeper I will more actively discourage it.
For sure, everyone contributes for their own reasons. Your earlier posts were referring to a “project” as a single entity, which I interpreted as the overall consensus position of a group of contributors.
Disclaimer: sorry if I was not clear enough, but I tried to be concise and avoid walls of text. Hope I did not cause any confusion or sounded rude - I did not mean any of that, and sorry if I let anyone think so.
I mostly agree with you: a “recommended tool” does not need to be shipped with Python. That said, pip ships with Python and this does not prevent it from having its own release cycle - but I may be wrong on this one.
Not a killer feature of course, not one that pip needs. I reckon it’d be useful overall (you can always edit those by hand afterwards), but my suggestion was more in the direction of “driving pyproject.toml adoption” and making it more integrated into Python’s ecosystem.
My underlying assumption was: pyproject.toml aims to be what Cargo.toml is for Rust or Project.toml is for Julia. This thread made it clear to me that - while this is partly true - it does not apply to pip, or at least that there is no plan by the SC/PyPA to make a “Cargo for Python”, nor to make pip more than what it already is. This does not exclude (eventually) recommending a tool or just a workflow. I reckon these PEPs (including 735) are going in this direction.
Definitely, what I wrote is partly tangential and I did not want to get overly verbose trying to explain the connection. I think adoption might be/will be/is driven by some niceties that the plain text format allows. I think being able to add optional and dev groups with tools is a great addition. My point on the “monolithic venv” was more to suggest that modern tools like Poetry, PDM, and Hatch do what I previously said and are designed in such a way as to make it really tough to persist with this bad practice.
Here I conflated a couple of things. I stand by what was said in this thread: pip has a privileged position. Basically every blog post on Earth about a library tells you to pip install it. pip ships with Python. What I wanted to say is that once/if there is a recommended workflow or tool, pip should be an active channel to spread it (even if pip won’t be that tool).
I agree with you that modern package managers are indeed spreading the adoption of these PEPs, including PEP 735 if it gets accepted (and I hope it will), and are also driving new PEPs (see the new discussion about lockfiles).
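For reference, a dependency group under PEP 735 is just a named list of requirements in pyproject.toml, separate from the install-time dependencies (the group names and packages below are illustrative, not prescribed by the PEP):

```toml
[dependency-groups]
dev = ["pytest>=8", "ruff"]
docs = ["sphinx"]
```

Tools that support the PEP can then install a group by name, without those requirements leaking into the package’s published metadata.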
This is true, but not in the way I suspect you mean. Pip has a privileged position because it is the only installer for Python packages that exists[1]. All of the other tools we’ve been discussing here (hatch, poetry, PDM, …) use pip to install packages behind the scenes.
So yes, pip is in the stdlib[2]. Not because it’s the best workflow tool, or because it’s the “recommended approach”, but purely because the core Python distribution needs to give users a way to install packages.
At this point, it would be a huge undertaking, as you said, to change all the documentation and all the courses that say your starting point is pip install <some tool>. That makes it even less likely that anyone will build a replacement for pip. But it’s possible - it’s just not what any of the workflow tools we’re talking about are trying to be.
ignoring conda, which has an independent ecosystem, and doesn’t fit in this discussion ↩︎
technically, ensurepip is in the stdlib, pip is a 3rd party package ↩︎
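That split is visible from the stdlib itself: ensurepip is importable even when pip is not installed, and it reports the version of the pip wheel it bundles. A minimal sketch (this assumes a build that hasn’t stripped ensurepip out, as some distro packages do):

```python
# ensurepip is the stdlib bootstrap shim; pip itself is a bundled
# third-party wheel that `python -m ensurepip` unpacks on demand.
import ensurepip

# Version of the bundled pip wheel, reported without importing pip.
print(ensurepip.version())
```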
Don’t mistake the Python Software Foundation for the CPython project. They are connected, but just because the PSF has a mission to grow and promote CPython, that doesn’t mean that the CPython project contributors have the exact same goals and motivations.
Wouldn’t one way to do that be to put a large notice on the Python.org download page saying something like “we suggest you try conda”? I guess that’s more blunt than many people seem to want? It seems clear to me, though, that the conda ecosystem is the most mature alternative distribution channel that is “drop-in” across platforms[1]. So if the goal is to shunt people away from the Python.org installers, a direct signpost to conda seems like a reasonable option.
as opposed to things like Linux distro packages that by design are targeting only one OS ↩︎
Look, I’m just trying to stop everyone obsessing about getting their preferred tool into the python.org distros. We’ll consider bringing in another tool when it’s proven that the ecosystem can’t function without it, or that the vast majority of our existing users are best served by including it.
Without a significant change in the sentiment of the core team, or an edict from the steering council (who usually base their decisions on the sentiment of the core team), nobody’s workflow tool gets to “win” by being in the upstream distros.
Outside of that, you don’t need core team approval to make a new distro with your tool, or to get your tool added to other distros. And if those distros become more popular than the upstream ones, that’s great!
I find this perspective intriguing, particularly considering the primary issue I’ve heard in the Python developer ecosystem (though this might be my own bubble) is the absence of a unified default packaging workflow.
I think the following exhausts my viewpoint on the situation, so I shan’t reply any further.
The first criterion involves attempting to prove a negative, so let’s set that aside for now. The second criterion, based on my experience, seems to have been true for quite some time.
Drawing from my involvement in the Pip codebase, addressing user queries on the Pip GitHub issues page, building and supporting Python communities within various companies, maintaining Python distributions for thousands of users, and supporting local public Python meetup groups, it appears to me that:
- Few users comprehend that Pip is primarily for bootstrapping other tools
- Many users are not aware that “pip install” only considers install-time dependencies and not their environment
- Consequently, most users lack the knowledge of how to manage a Python environment using Pip
- Users who employ more than one index (e.g., pypi.org, download.pytorch.org, private ones) often use them insecurely, exposing themselves to attack vectors because of their lack of knowledge of how Pip handles multiple indexes
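That last pitfall is worth spelling out: with --extra-index-url, pip treats every configured index as equally authoritative, so a public package that shares a name with a private one can win the resolution (“dependency confusion”). The usual mitigation is to point index-url at a single trusted proxy that merges the indexes server-side. A hypothetical pip.conf doing that (mirror.example.com is a placeholder, not a real service):

```ini
# Hypothetical pip.conf: resolve everything through one trusted
# proxy/mirror instead of mixing indexes with extra-index-url.
[global]
index-url = https://mirror.example.com/simple
```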
If the ideal future solution is “use a different distribution than from python.org,” perhaps it would make sense to make that distribution more challenging to work with for inexperienced users. For instance, not producing any installers and only distributing the source code there, or not providing any self bootstrapping utilities such as bundling Pip or having ensurepip in the standard library. This would undoubtedly incentivize many more actors to distribute independently of the PSF.
The recommendation is that folks choose a Python distribution that is most appropriate for their use case. The python.org builds serve to set a minimal baseline of expected capabilities rather than attempting to be all things to all people.
The workflows that are appropriate for writing ad hoc personal and system management scripts differ from those for writing utility libraries for publication or web applications with a potentially global footprint, hence it being more appropriate for the related tools to be chosen by users rather than included by default.
I didn’t know that! So to make sure I’m not misunderstanding, poetry includes things like a PEP 517 sdist builder (I knew it had its own resolver, although I wasn’t clear on when that was used)?
My point does stand, though, in that if Poetry can’t function without pip being installed (I’m not sure if I can assume that “non-default” means “everything works even if pip isn’t available”) then pip remains the only candidate to be the bootstrap installer in the stdlib.
I’m excluding rip because it’s written in Rust and so is not really a practical candidate for stdlib inclusion[1].
Regardless, it’s good to see that there’s at least some competition in the “Python package installer” space. Even if it’s a bit under-publicised at the moment.