You undersell yourself here. You are far better informed than the vast majority of users, including those who came in through a slightly different path, landed in Conda-land, and never saw a need to look further (for exactly the same reasons you gave for looking no further than PyPI).
The reason not to recommend PyPI-only is, indeed, “specialist needs”. But that’s the same reason to not recommend anything “only”. Specialist needs are why Conda users also use pip, and why Debian users also install from source, and so on.
And what we’ve seen from surveys and feedback is that most people don’t think they have specialist needs. So you can’t ask them and then suggest a path forward, because they can’t tell you. They just think that the tool is broken because it doesn’t do whatever perfectly normal thing they need (e.g. installing a database driver, replicating the environment on deployment, or getting past IT’s enterprise-grade proxy).
Our discussions right now seem to be threading between two sets of specialised needs that are not well served today:
packages with unspoken/unlabelled ABI requirements
packages with insufficient/outdated metadata
(Hmm… when I spell them out like that, they seem kind of similar.)
As with all specialised needs, nobody thinks they need them until it turns out they need them. Nobody could have told you at the start that they were going to need them. And now that they know, some are going to be frustrated that we didn’t tell them up front, or that we aren’t doing anything about them now.
I don’t buy the equivalence of that commitment. People have many reasons for choosing an OS, or rather for not changing it, not least because the amount of things they’d have to change is so large.
But that doesn’t make them (in my view) “committed” to being bound to the OS’ infrastructure packages, especially if something else can provide them with a high enough degree of isolation that it works and doesn’t threaten the system’s stability.
The macOS footnote I had illustrates this well IMO - people on EOL macOS versions still want to be able to install the latest and greatest packages, but in many cases haven’t even upgraded past the OS version that prevents them from doing so. The same goes for people on corporate RHEL installs.
which is admittedly nontrivial, but not impossible. ↩︎
Sure, and this is exactly what Conda and Nix (and probably others) do. They bring all of the key infrastructure packages with them and isolate them from the OS. Implicit in my response was the assumption that the user is using “whatever Python was preinstalled” and they refuse to change, which has been an explicit statement a few times in this thread.
If the user is willing to change to a different install of Python from a different source, then they can be bound to the infrastructure packages used by that one instead. But you can’t get out of having to use the same infrastructure throughout your stack, no matter where you get it from (unless you build it all yourself, which makes you the distributor, and you can update as frequently as your distributor wants to).
You misunderstood my point. Someone who knows they have specialist needs has already got enough knowledge to be able to deal with the choices involved. The question I’m asking is what’s unreasonable about recommending PyPI/wheels to people who don’t know or believe they need anything special? Apart from anything else, they then won’t be confused by the vast amount of documentation that already exists assuming you’re using that option.
I reiterate, for clarity - I’m only talking about cases where PyPI/wheels is a good basic option - specifically on Windows, because that’s the platform I know about.
So why not document (as best we can) the boundaries of what the standard PyPI/wheel ecosystem offers, and then present it as the “standard solution”, unifying on that? We can’t do anything about people not knowing what they might need in the future, but we can at least warn them of the limits so they know when they are approaching them. As you say, everything fails to meet some level of specialised need, so “it’s not perfect” isn’t a compelling argument here.
The topic here is tools, not the underlying PyPI/wheel infrastructure. Maybe there’s value in debating how we can integrate other package distribution ecosystems with the PyPI/wheel system - but it should be a separate discussion IMO.
I’m not sure we can read the minds of the survey respondents like that. What I hear from the comments is that “everything should be replaced with something simple & unified” – I’m not saying that’s realistic, but I strongly doubt that many users of that mindset care particularly about the infrastructure or binary formats behind their UX, much less about the existing ones.
(I help to maintain setuptools, but I don’t speak for the project, these are my personal opinions)
The following is a comment on how I think we should approach a unification:
If we ever unify the tools for Python packaging, I think it is important for this hypothetical tool to be able to handle legacy packages:
It is not fair to users to say “now we have this single tool that everyone is supposed to use”, but then, if they need an older package, to expect them to ignore all the most recent documentation on the topic and go figure out how to maintain something different.
It is not fair to any tool developer to discourage the use of their tool while at the same time placing the burden of maintaining the ecosystem on their shoulders.
This is not easy and requires a lot of work. Of course there is also another way: declare everything that is not compatible with this hypothetical tool unsupported, as mentioned by Paul:
The following is not an argument against or in favour of the proposal, just me expressing feelings and thoughts that have been puzzling me. Hindsight bias, I know, but it is not easy to ignore…
It is important to recognize that this problem of “too many tools” (if indeed it really is a problem) is partially a problem of our own making. Years ago there were some “de facto” standards in the ecosystem, and the packaging community invested a lot of effort to create standards (with a more or less declared goal of removing the monopoly of such tools). My opinion is that it was a noble goal; it incentivised openness, and created opportunities to handle niche requirements or to experiment and try new things.
Going back to “there is only one blessed tool” feels like a throwback… If this had been the goal from the beginning, the community could have saved time/money/energy by everyone working together to fix/improve/rebuild existing tools instead of splitting our efforts.
Is it respectful of the work people have put in? (I don’t speak only of setuptools, but in general.) The maintainers have put love and hours of work into trying to make the ecosystem better by creating and complying with interoperability PEPs… If they had known from the beginning that the PSF would eventually endorse only one tool and “would not recommend X”, would they have invested the same amount of love and effort? Was it worthwhile to comply with interoperability PEPs? Would it have been better if we had all worked together towards making a new tool and moving all the packages to it?
Moreover, is it respectful of the work the users had to put in to adapt? This process of standardisation was the source of a lot of “growing pains” that were imposed on the community because, at that point in time, it was deemed necessary that no single tool had preferential treatment in the Python packaging ecosystem. But now we are talking about giving a single tool preferential treatment…
My experience is more along the lines of Steve’s point here. A great many Python users do not realise that their own usage of Python (and “third party” packages) might be considered “specialist” or “niche” by someone else. I speak to many people who wouldn’t dream of using a language that didn’t have, say, multidimensional arrays, so the idea that someone could use Python without NumPy would seem extremely strange to them. You really can’t expect all people to understand that “Python” is used for a wide array of things unrelated to their own use cases, and that this is indirectly why people have wildly different ways of setting up and installing things. You might imagine that people doing AI/ML must be more proficient in the basics of programming and software engineering than people using Python for other things, and would therefore have a clearer understanding of the Venn diagram of Python ecosystems, but that’s absolutely not the case.
To my mind, it is not a question of lack of respect for the respective maintainers, who’ve done a fantastic job in very challenging conditions. It may be my biased view, but in view of the scope of the problems to solve, as well as the lack of deeper language integration of packaging, the interoperability PEPs were the only halfway realistic path forward – no single project could reasonably hope to take on the responsibility of serving the entirety of the Python ecosystem by itself (without systematic support, i.e. language commitment).
On the one hand, having competing solutions is great for innovation, but horrible for duplication of work. And as the survey shows, users don’t exactly appreciate that decentralized and fragmented approach. We may yet get to have our cake and eat it too, if indeed we manage to hide all those different tools behind a unified interface, and I think it would be a large improvement, even though I doubt we can avoid those interfaces leaking implementation details of the backends quite heavily.
In any case, if there were a drive towards a more centralised solution, I certainly would not see this as disrespectful towards those who have gotten us as far as we are now. I get the emotional investment in something one has spent a long time working on, but ideally, we should be able to uncouple the design decisions going forward from previous efforts (especially if we can agree to remove/lift/change some constraints that all-but-forced certain decisions at the time).
huge amount of responsibility for a thankless task that makes people scream loudly if anything breaks ↩︎
and I certainly won’t claim that I don’t occasionally fall prey to that as well ↩︎
talking generally, not alluding to any specific one here ↩︎
Thanks @abravalheri for expressing that point of view. I have similar feelings around respecting maintainers time - both what they’ve done in the past, and what we may be asking of them in the future. And I think the standardization of metadata in pyproject.toml and of build interfaces (PEP 517 & co) is one of the success stories of Python packaging. No need to turn back on that one and aim for unification of build tools imho.
Overall I think we still are trying to figure out what is feasible and a good idea to unify, or not. The message from users is that the current state of things is still not great, and to please unify something - but it’s very much unclear what that something is.
@pradyunsg asked that question pretty explicitly. I’ll repeat it here, with my answers:
Unification of PyPI/conda models → NO
Unification of the consumer-facing tooling → NO
Unification of the publisher-facing tooling → NO
Unification of the workflow setups/tooling → YES
Unification/Consistency in the deployment processes → NO
Unification/Consistency in “Python” installation/management experience → NO
Unification of the interface of tools → YES (as much as possible)
It’d be great to see others’ answers to this.
Regarding some of the other topics in this thread, I think they come in because there are a number of inter-related things here. If you say something should be unified, you should at least have some level of confidence that it’s a good idea to pursue that unification and that there are no hard blockers.
I wrote a blog post with a comprehensive possible future direction; however, the content there all follows from a few things: the “what to unify” questions above, Steve’s question on system integrators, and the assumption that major breaking changes have to be avoided. I’d really like to get a better sense of whether others have a similar understanding at this very highest level.
Unification of the consumer-facing tooling → NO, with a caveat. I don’t think we should try to force maintainers to work on a single tool, but if competition between tools results in users choosing a clear winner, I think we should accept that.
Unification of the publisher-facing tooling → NO. I assume this means things like build backends.
Unification of the workflow setups/tooling → PARTIALLY. I very definitely don’t think that (like cargo) we should mandate that every time anyone uses Python, they should create a directory containing a src subdirectory and a pyproject.toml. The workflow of writing a simple script (with dependencies) in a scratch directory full of “other stuff” is an entirely reasonable workflow that we should support. Having said that, I support unified workflows for the tasks of “write a Python package” and “write a Python application” (although I think the latter is something we’ve traditionally ignored in favour of “write a package with a script entry point”).
Unification/Consistency in the deployment processes → NO. Although I’m not 100% sure what this entails. It shouldn’t be user-facing, though, which is why I say “no”.
Unification/Consistency in “Python” installation/management experience → NO. Although I think we should accept that this is not under our control, and like it or not, the main Python website is where people go for advice on where to get Python from. So we should work with the guidance given there, not fight against it.
Unification of the interface of tools → YES (but see below).
I’m not sure I understand the difference between “consumer-facing tooling” and “workflow setups/tooling” though. For the purposes of the above, I’ve taken the former as meaning the actual software, and the latter as meaning the processes. So we can have hatch and PDM, but they should manage the same project layout, expect tests and documentation to be laid out in the same way, etc.
As regards “interface”, there are two aspects - low level details such as the names of options, configuration files, etc., and higher level concerns like what subcommands a tool supports. For example, I’d love to see a shared configuration mechanism, so that users can set their network details and preferred index once. And I’d like a common set of workflow commands (things like “run an interpreter in the project environment”, “run the tests”, “build the docs”, “build the project artifacts”, “publish the project”, …) But I don’t want this to be an excuse to argue endlessly over whether an option should be called --config or -C. And I definitely don’t want it to override questions of backward compatibility for individual tools (which should very much be the tool maintainer’s choice).
Regarding the other discussions in the thread, I support better integration with, and support for, other system distributors/integrators. But I strongly disagree with any suggestion that PyPI and wheels should no longer be considered the primary route by which (most) users get Python packages. Having said that, I think such support needs to be a two-way street, and if “other system integrators” want to be supported, they need to engage with the community as a whole and get involved with this process - otherwise, we should accept that what support and integration we provide will, of necessity, be limited (e.g., I don’t think we should try to write a “how to use apt to install Python packages” page in the Python packaging documentation, but we could link to a Debian “how to use apt for Python users” page if they provided a suitable link that was written for the same audience that we are addressing).
I also don’t think it’s at all clear from what I’ve heard of the survey results, what the users are asking for in terms of the above questions. And I think that’s a far more important question than what we think.
Maybe it’s a case of “worse is better”, but I strongly believe that without PyPI and wheels, Python would never have achieved the popularity it has. ↩︎
Although if the users are asking for (for example) “cargo for Python”, then my response is “great, I hope someone writes it for them”. Just because the users want it, doesn’t mean that’s where I’ll personally devote my open source (volunteer) time. ↩︎
Language integration has been mentioned a couple times but without explanation of what that would be.
I don’t see what people mean. The language has an import system, with sys.path supporting multiple sources, and the site module handling site-packages and user-packages locations. Then separate installer tools can look for distributions and install them in the right locations. What else could the language do?
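To illustrate how little machinery this actually is, here is a short sketch of the pieces mentioned above (the extra directory path at the end is invented for illustration):

```python
import site
import sys

# The import system resolves imports by scanning sys.path in order;
# an installer's only real job is to place files into one of these
# directories.
print(sys.path[:3])

# The site module computes the standard install locations:
print(site.getusersitepackages())  # per-user packages location
print(site.getsitepackages())      # interpreter-wide site-packages

# Any tool (or user) can extend the search path at runtime:
sys.path.append("/opt/extra-packages")  # hypothetical directory
```

Everything beyond this (resolving dependencies, downloading, building) lives in separate tools, which is presumably where the “language integration” discussion comes in.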
I’m going to refrain from answering because every “no” makes my day job harder, but that doesn’t mean a “no” doesn’t make more sense for the community.
Same here since we are talking about programmers installing stuff to program. Is the difference, “I’m using Python code, but not writing any” (e.g. running some CLI app) versus “I’m writing Python code myself that requires installing something”?
And what does “workflow setups/tooling” mean? Would trying to standardize where environments are created, named, and stored make sense in this scenario (which I’ve been asking the community about over on Mastodon lately)? Is this standardizing on Nox and/or tox? Or is this more about src/ layout and how to specify your development dependencies separate from your installation dependencies?
I’m the maintainer (and, other than former colleagues inheriting my code, the only user) of snoap, which admittedly is not a packaging tool, but it does do packaging and deployment via Poetry and pip.
The main reason I devised snoap is to address this distinction in a way that was appropriate for my work environment at the time:
This thread has gone very deep into packaging with native and/or compiled dependencies. I think that’s totally valid, because if you pull at any string in a Python environment, you’ll probably hit that stuff sooner or later. However, I’m not sure the users who answered the survey were really thinking in those terms. I’d guess (and it is a guess, could be way out) that most respondents are not wrestling with those issues on a day-to-day basis. They are probably more bothered by nagging doubts like “if I use Hatch and my colleague uses Poetry, can we collaborate on a project?”, or “I want to share this useful utility that I wrote in Python with my non-technical colleagues, but it has external dependencies and I’m dreading having to talk them through how to create a virtual environment”, or “is it best practice to set a maximum version of dependencies in pyproject.toml or not?”, or “I really feel like I should know what setup.py is, as I see it all over the place, but I’ve never needed one”.
This isn’t to say low-level packaging is not the root of some of these issues, but it would be useful to have a clearer breakdown of exactly what user experience is driving the desire for unification, what those users mean by unification, and how they think it would solve their problems.
(Maintainer of PDM here)
I’ve actually been following this discussion for a long time, but I’ve been hesitant about what position to respond from and how to organize my language.
An interesting fact is that, among the tools mentioned in the discussion, PDM and Poetry are the only ones not under the umbrella of the PyPA and whose maintainers don’t show up here, and they are also the tools that offer the most functionality, which should make them a good starting point for a unified package manager. Admittedly, Poetry’s package metadata format does not adhere to PEP 621 (they will, hopefully), but PDM might have been among the first few tools, if not the first, to support PEP 621, shortly after it was accepted. Although PDM began its life as the only package manager to support PEP 582, it is a bit frustrating that people continue to resist adopting it because of the incomplete draft PEP, even though PDM has defaulted to venv since 2.0. In my opinion, the PyPA seems to favor single-purpose tools that do one thing well, rather than all-in-one tools. The most adopted package managers (or build backends) inside the PyPA are flit and hatch.
So let me promote PDM a little here: it provides all the features mentioned in this thread and has a CLI directly inspired by npm. I am also rewriting the build backend to provide an extensible interface similar to hatchling’s. However, PDM itself isn’t tied to any specific build backend, and users can choose whatever they like. The next step is to add workspace support similar to how Cargo works.
This prompted me to think. I don’t want to focus on the Hatch vs PDM vs Poetry debate here, but let’s suppose for a moment that someone waves a magic wand and we get consensus on one tool. What would we then actually do to make that consensus effective?
I don’t actually have a good answer to this. But it feels like the user community expects the PyPA to have some sort of influence (or authority) over this, and our biggest issue is basically that we don’t…
vs any other tool with a similar scope that I’ve missed… ↩︎
Well, the sky’s the limit really, but off the top of my head:
enforcing metadata consistency / accuracy
using that metadata to do useful things in the python ecosystem
for example: regularly run CPython main against the ecosystem (compare Rust’s crater runs, or Scala’s community build), identify regressions early, make them release blockers
solving the “I have to run arbitrary untrusted code” to install a package problem
preventing very confusing splits like venv vs. virtualenv
remove major installation footguns with the same urgency as other big UX problems (like pip not having a resolver for years, or how easy it is to mess up your base Python install)
use “happy path” defaults that solve the 90% case, leave expert mode to explicit opt-in
bring PyPA and SC much closer together
etc. etc. etc.
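On the venv vs. virtualenv split mentioned above: part of what makes it confusing is that for the common case the stdlib module alone is already enough, yet both names survive in tutorials and muscle memory. A minimal sketch of the stdlib path (directory name invented here):

```python
import tempfile
import venv
from pathlib import Path

# Both venv (stdlib) and virtualenv (third party) produce an
# environment marked by a pyvenv.cfg file; the interpreter uses that
# marker to detect it is running inside the environment.
with tempfile.TemporaryDirectory() as tmp:
    env_dir = Path(tmp) / "demo-env"
    # with_pip=False keeps this example fast; with_pip=True would
    # also bootstrap pip into the new environment.
    venv.create(env_dir, with_pip=False)
    print((env_dir / "pyvenv.cfg").exists())
```

virtualenv still exists for real reasons (speed, older Pythons, extra features), but a newcomer has no way to know that, which is exactly the kind of split a coordinating authority could have prevented or retired.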
In the absence of all that, you get small groups of volunteers trying their hardest for their use cases / user base / niche, but anyone who’s exposed to several use cases / user bases / niches will run into the chasms between the various insular tools.
Some interoperability bridges have been built over time, but frankly, I don’t find it fair to put the responsibility for “Step 0” of installing/using/distributing something written in Python on a free-floating set of volunteers and hope they coalesce on a common strategy.
Yes, all of this is hard (especially since Python inherits a large part of the “lack of language integration” problem from C/C++ for code it wraps), and it’s not as sexy as a shiny new library or performance improvements, and everyone’s busy, and almost everyone’s working on this in their free time, etc. I’m aware there’s no panacea, but fixing these things should start with a commitment / plan / direction on the language level, because that’s the only place where common goals can really be set.
PS. For a particularly egregious illustration of the combined effects of bad metadata, arbitrary code execution & lack of tooling consistency (i.e. compiler flags), see this blog post.
I should qualify my answers in the light of this. As an end user, my answer would be “YES” to everything (with the usual proviso that I’d be unhappy if I didn’t personally like the chosen unification).
My answers were very much from the perspective of a packaging specialist who knows the trade-offs. And maybe that’s actually the wrong way of looking at this? I feel as though we’re resisting what is in fact a very clear message from the users, and the reason is that we simply don’t have a good answer to @smm’s follow-up question “how do we go about doing it”, so we’re chipping at the edges of what we feel we can manage - which from an end user perspective is disappointingly little.
Actually there’s no question - it is the wrong way. ↩︎
Methodology aside, that is an interesting blog post, thanks! I don’t agree with many details (including some rather opinionated conclusions), but I think overall it’d be fair to say that it makes the same point as the survey respondents (“way too many tools!” / “unify or bust!”), quite forcefully.
It also discusses this very thread, plus there’s pretty active discussions about the article on Hacker News and Reddit, but – perhaps unsurprisingly – the tone there is less polite than on DPO, so maybe don’t click if this thread is already exhausting to you.
Still, if there’s something to be gleaned from the usual internet chaos, it provides a window into just how unhappy people are with Python packaging. That’s not pleasant reading for most people in this thread, who have poured their free time into making things better, but my take-away from those comments is that they underscore the point that average users overwhelmingly prefer homogeneity (even if enforced) to the tooling diversity / freedom / innovation that has been the MO so far.
Here’s one of the more cogent examples:
as opposed to a more maintainer-heavy audience here ↩︎
I notice that that table lists “packaging C extensions” as the missing (or partially missing) functionality. I’m not entirely sure what that means, but no tool can “unify” the others if it does not support a major packaging use case, because there will always be the need for some separate tool that does handle that case. Perhaps that could look more like pdm or poetry (or hatch) plus some backend extension or something, though, so at least a single tool could be used for all the other tasks.
There should be some consideration though of what people who are working with C extensions are supposed to do. It’s not just a case of maintainers packaging things for PyPI/conda but also the bigger group of people who would install from source.