FWIW, it’s been a case of not-enough-hours-in-the-day for me to work on pradyunsg/installer. I’ve been stretched a little too thin lately, so installer has fallen by the wayside.
If someone is willing to spend the time, I’m open to co-working on this over coming weekends – https://calendly.com/pradyunsg/weekend-hour. This is especially open to Linux distro maintainers, since, well, they’re the most invested in this.
No worries, I really appreciate what you are doing. This thread is about the current state of affairs and is not meant to put pressure on moving this forward, since the resolution requires volunteer time.
I can sign up. I would just ask that you write down the topics we need to address so that we can more easily organize, track progress, and see what is missing. It doesn’t need to be the full list, just the current working pool and maybe things for the near future.
I think it does. Spec implementation is hard, and it is very easy to get tunnel-vision and miss details when everyone is looking at the same thing. Parallel development with good communication (“comparing notes”) is a good way to make sure edge cases are all covered, in my experience.
Also, although it is (relatively) clear which specifications are in scope for a wheel installer, the interface to access them is still in no way obvious (well, everyone agrees they know what the best interface is). Friendly competition is a good way to drive UX innovation.
Strong +1 on this. The main reason we created PEP 517 was to allow multiple implementations. We now have a number of build backends. Having multiple frontends is also hugely important to the health of the community - as @FFY00 has already noted by pointing out that pip isn’t the right fit for distributions. Having multiple wheel installers, far from being “duplication of effort”, is actually the whole intention behind the packaging standards approach, and I’m disappointed if people have missed that message and are still hoping for “one true tool”.
-1 on this. PEP 517 has been around for 5 years now, and I don’t think we should need to treat it as a “second class citizen” by this point. I’m not saying that we should actively push people away from setuptools, but I see no reason to stop any project from using an alternative backend if it suits their needs. If Linux distributions still aren’t in a position to handle that, then I have sympathy, but I’m not inclined to hold the ecosystem back until the distros address that problem. Distros can ask individual packages, if they so wish, but I don’t think the “Python packaging community” or the PyPA should add their weight to such a message.
One personal note about this: if we go down this road, we should make it fairly clear what the relationship between the various projects is, because I’d like to not deal with folks-who-do-not-wait-to-criticise coming in and saying demotivating things about either effort.
It opens the space for build backends other than setuptools. The new build backends (such as poetry/flit) can offer a simplified, improved UX (e.g. a simple TOML configuration file rather than executed Python code). New build backends are not obliged to offer feature parity with setuptools (most notably building C extensions), so they can simplify the packaging experience for the majority of users working on a pure-Python codebase. You can read more about this in my blog post at https://www.bernat.tech/pep-517-518/
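As a concrete illustration, a pure-Python project using flit_core as its backend can declare everything statically in pyproject.toml (the project name and metadata below are hypothetical):

```toml
# PEP 518 build-system table: tells any frontend how to build this project
[build-system]
requires = ["flit_core >=3.2,<4"]
build-backend = "flit_core.buildapi"

# Static project metadata, replacing an executable setup.py
[project]
name = "mypackage"
version = "1.0.0"
description = "A hypothetical pure-Python package"
```

Any PEP 517 frontend (pip, pypa/build, …) can consume this without running project-supplied Python code.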
I don’t think it’s stalled. I think https://github.com/pfmoore/editables is a direction we agreed would be great to go down. However, it’s waiting for people willing to integrate it with setuptools/flit as a proof of concept, after which we can move to standardise it. I’m currently working on the build tools and the tox rewrite, so until those roll out I’m sadly not available to work on this.
I was going to write a slightly angry reply to this, but I will refrain from doing so and instead try to summarize my points. Please note that I am extremely disappointed with this response; I have been trying my best to fix the issue, and this kind of inflexibility to compromise in any way is very demotivating.
PEP 517 has been around for 5 years, yes, but it is still provisional, and the period during which it has actually been usable has been far shorter.
I don’t think it’s reasonable to expect distribution packagers to be actively monitoring PEPs that affect the packaging ecosystem or to follow this high-volume discourse. IMO it should be the PyPA’s job to at least make the minimum effort to understand how proposed changes will affect packaging users, not limited to pip. As it stands, PyPA’s packaging interest == pip, which I believe is very wrong. Perhaps a distributors mailing list could be created, but that still requires some movement from PyPA to try to get feedback, since PyPA members don’t seem to be thinking about these use-cases.
This comment in particular is hurtful to me, as it is not what I am proposing at all.
How is asking to keep using setuptools, which fully supports PEP 517 (not that it was even needed, since PEP 517 is backward compatible), on less than a handful of core packaging packages, like pypa/packaging, holding the ecosystem back?
I have even outlined an alternative: if you really need another backend, please choose one that will include a setup.py in the sdist.
Again, please let me know in which way this is holding back the ecosystem?
And please let me know how you suggest distributions should fix the problem, because to me the only clear solution is to stop updating packages, which would just end up with more bugs being opened against pip and all the other packages. That is something I really don’t want to happen, but maybe I just shouldn’t care, as that is the same courtesy I have been given.
In which case, I apologise. But I’ve clearly missed what you are proposing - all I can say is that I did point out earlier that I was unclear what you were suggesting.
I thought you were suggesting that we encourage packages in general to stick to setuptools. Re-reading your original post, I see I completely missed the point of that section of what you were saying. I skipped down to the tldr section which didn’t mention it. My bad.
But the PyPA governance PEP explicitly states that setting policies for individual projects is a specific non-goal. So I think you need to ask the individual projects. Speaking for pip, we don’t have any plans to move off setuptools, for what it’s worth. And if any other pip developer proposes we move to flit, I’ll link to this discussion so the information is available.
I thought flit included a setup.py. It looks like maybe newer versions have stopped doing that. So basically you’re asking for a set of 5 packages to avoid using flit? As I say, that’s a much more limited scope than I realised.
That was my misunderstanding, so accept my apologies once more.
I don’t honestly know. Maybe discuss the use of flit with the projects that are using it? I can’t say how they would react, but that would be where I’d suggest starting. If they don’t want to change, I think you have to respect that - the PyPA is not the right group to go to for support in pressing for change.
Apart from that, working on (or helping with) a wheel installer sounds like a way forward, or developing ways of automating a focused manual unzip process for wheels for the projects you’re concerned about.
Honestly, I don’t know - I sympathise with your problem but don’t have any solutions for you.
I’m sorry if my misunderstanding, and my response based on that misunderstanding, upset and annoyed you. That wasn’t my intention.
(I’ll note that as the maintainer of build, you’re as much part of the PyPA as me, so I hope that you understand here that I’m not commenting from any position of authority, but simply as a colleague looking at the concerns you’re bringing to the table).
I feel that the PyPA does try to understand the user’s perspective. Personally, I’m focused mainly on people wanting to create packages, and people wanting to install packages. I don’t have much insight into how distributions differ from other people wanting to install packages, so to that extent your perspective as a PyPA member is very valuable. But I’d also note that there are many more package builders and installers than there are distribution maintainers, so I’d personally be cautious about giving undue weight to distro maintainers in the (rare) cases where there’s a conflict.
I definitely don’t see the PyPA’s focus as being on pip. Far from it - even as a pip maintainer, I’m extremely interested in making sure that the PyPA pushes to set up standards that allow people develop their own tools, and not be locked into pip as the “one privileged tool”. (I could take offense at the fact that you think otherwise, because for me, interoperability is all about breaking down barriers to competition - but I won’t, because I’m well aware that you’ve no reason to know about the relevant discussions I’ve been involved in).
Hopefully, I’m explaining my position better this time. Regardless, though, I’ve said enough, and I’ll leave it here - I don’t want to unduly influence the discussion, so I’ll leave the floor open for others to give their views.
It’s specifically about the burden of using setuptools compared to flit. The maintenance cost of keeping setuptools support in such a simple project is unfortunately non-zero. So it’s not just the pyproject.toml part; it’s a workflow thing as well.
So it sounds like finishing installer unblocks people. OK, so if someone commits to help out @pradyunsg to aim to get installer done and in PyPA by the end of 2021, I will support reverting the flit usage in packaging to give folks the time they need to move things forward.
Does that seem fair to everyone: to unblock folks today to make sure no one is blocked in 2022?
This. I would like to have a test that shows that it is still possible to install entirely from source (that is, from source in a vcs, not pre-made sdist).
In Nixpkgs we currently use pip but I’ve been wanting to switch to build. Due to its dependencies I haven’t made the change yet, but I should at some point.
When that’s done I am willing to help set up a CI with Nix that checks whether bootstrapping from source remains possible.
Just now I gave it a try to see whether I could bootstrap build. Using our bootstrapped-pip (essentially unpacked pip, setuptools and wheel archives) it works fine, but there is less point in using build until installer is available. Right now I am assuming every package has its package name as its folder name, so they can easily be added to PYTHONPATH. I know some projects like calling their source folder src; it would be convenient if that did not happen with the packages required for bootstrapping.
I’m OK with asking to keep setuptools as the build backend. This, though, makes it hard for these projects to test against site-packages rather than the local source tree, so I don’t think this request is reasonable. Projects don’t use a src layout because they like it, but because it solves an active problem during development and testing.
None of the projects is using src currently, so apparently it doesn’t seem to be that big of a deal just yet. Or did I give the maintainers an idea?
Fine. Not being able to rely on the package name as the folder name would mean explicitly listing the packages you need along with their corresponding module directories in order to build a PYTHONPATH. Unfortunate, but really not that big of a deal. I suppose most distros do that already anyway.
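For example, that explicit listing could look something like this (the project-to-directory mapping is entirely hypothetical; real checkout paths will differ per distro):

```python
import os

# Hypothetical mapping from project name to the subdirectory that holds
# its importable package, covering projects whose layout differs (src/).
BOOTSTRAP_SOURCES = {
    "packaging": "packaging",      # package folder matches project name
    "build": "build/src",          # assumed src/ layout
    "installer": "installer/src",  # assumed src/ layout
}

def bootstrap_pythonpath(checkout_root: str) -> str:
    """Join the source directories into a PYTHONPATH value."""
    return os.pathsep.join(
        os.path.join(checkout_root, subdir)
        for subdir in BOOTSTRAP_SOURCES.values()
    )
```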
The PyPA is a community and the distro maintainers can be a part of it, or at least that is what I imagine it to be. In that sense, are you not also a part of it?
This I unfortunately agree with. Following python-dev, the Packaging category here, and interacting in some of the discussions as well as on the issue trackers of some of the projects, it is also my experience that the use cases and issues of distros are just not considered well.
Distros have issues like bootstrapping, and they deal with large numbers of packages, so they want or need a structured approach for fetching package information and for building and testing those packages. We need to resolve the dependencies of a large set of packages when updating our package set, and to be able to override requirements when solving, because there are always packages out there that pin too strictly.
I’ll request everyone engaging in this discussion to avoid making broad-strokes arguments/statements about the state of Linux distros + Python packaging and who’s responsible for what — please make dedicated threads for those discussions. Saying those things here is inevitably going to derail constructive discussion, and I really don’t want that happening.
I’d like to not have this thread become about people complaining/debating about how status quo isn’t ideal (look around, nothing about the world is ideal at the moment). Let’s keep this discussion focused on the very specific suggestion in OP to resolve one specific issue.
Wanna have broader-picture discussions / talk about responsibilities / anything that’s not 100% about the OP suggestion? Please make a new topic.
My suggestion then for all the Linux distro folks is to organize and create a mailing list where you can coordinate what needs and issues you have to surface them to the appropriate group as a collective rather than as individual “Nix”, “Arch”, “Debian/Ubuntu”, or “Fedora/RH” asks. I’m sure the ML could be hosted on mail.python.org if you asked.
No worries. I can see that the post is written in such a way that you would probably need to read it consecutively to understand what is being proposed.
I consider this a large ecosystem-wide change.
the PyPA should only be concerned with large, ecosystem-wide changes
The goal is to make bootstrapping a packaging environment a reasonable task.
Then perhaps it should change its name, as the “Packaging Authority” is exactly whom I would contact asking for support in a change like this.
I think using “Authority” in the name makes users feel like the PyPA is something they should be listening to and following guidelines from. As it stands, it appears to be just a group of projects. I do not believe this is right: you get to advertise to users that they should move to PEP 517, but when the ecosystem workflow breaks you get to say “it’s not our problem”.
I get that, but you must understand that is not the current status quo. Well, it’s something that is definitely changing, and I am very thankful for that, but it has not fully been achieved yet.
Pushing standards is a very big thing, but another big thing to me is interoperability of code. And this specifically is where I feel pip is privileged: why isn’t the pip install code a library that pip just uses? Or the build code? In pypa/build we had to essentially reimplement something pip already does, the build isolation. Why isn’t the pip resolver a separate library? It just resolves a set of PEP 508 dependency strings, no?
Why don’t these exist as libraries, like pypa/pep517 or pypa/packaging?
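pypa/packaging does already cover one slice of this: parsing PEP 508 dependency strings into structured data that a standalone resolver could consume. (The requirement string below is just an example.)

```python
from packaging.requirements import Requirement

# Parse a PEP 508 dependency string into its structured pieces.
req = Requirement('requests[security] >= 2.8.1; python_version > "3.6"')

print(req.name)       # requests
print(req.extras)     # {'security'}
print(req.specifier)  # >=2.8.1
print(req.marker)     # python_version > "3.6"
```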
We are in a state where the PyPA is pushing change, but mostly focuses on implementing those changes in its own tooling. While it’s great that there is a standard that allows other people to develop their own tools, by pushing change like this the PyPA is putting a gigantic extra burden on other users of the ecosystem, forcing them to create their own tooling. Other users might not be very Python-focused; some distributors might not even have access to someone well versed in Python, yet they are now forced to write their own tooling from scratch.
Do you understand this? I feel that most people here do not really understand what these ecosystem changes mean for other people, especially when they are performed in the way they are.
What I am asking for in this thread is to please let people catch up before starting to make changes that rely on the new ecosystem changes.
This is also where I feel there is a disconnect between the PyPA and other Python package distributors. There is a point in specific which I think would bridge most of the differences if the PyPA started caring about it:
Bootstrapping a packaging environment from scratch (no vendoring)
I do understand that.
I think I can commit to that.
No, that solves my problem but leaves out people, which is specifically what I am asking the PyPA not to do.
We should have a ML for Python distributors, not just for Linux distributions; these issues are relevant for all distributors, not just Linux ones. There was already a discussion about this on python-dev, in the tzdata dependency thread, so perhaps we should just go ahead and create the ML.
Quite simply, because that’s not how pip was written. We’re slowly changing that, by defining standards, writing libraries (like packaging) that tools can use to follow those standards, and using those libraries in pip. Of course, that means that bootstrapping pip is a lot harder, so we have to vendor a whole bunch of stuff. That works fine on Windows, but conflicts with Linux distro policies. We work with the distros on devendoring, but we don’t maintain a devendored version of pip ourselves.
And I expect pip to vendor that code as soon as we can, so we’re not replicating it, but we use the same code as everyone else will. Of course, if pypa/build hasn’t implemented the mechanism in a reusable manner either (I think I recall hearing that the intention was to do so, but I may be wrong) we need to wait for someone else to do that so we can both depend on the library version…
It is - resolvelib. Again, pip just vendors it. There’s a lot of machinery around that library that’s still pip-specific, but that’s just because no-one has had the time to extract it into a library. (I’m dabbling in making pip’s finder into a library, but haven’t got very far, as it’s either trivial or incredibly complex, depending on where you draw the boundary…)
Mostly lack of developer resource, as usual.
Yes - sort of. I understand that’s how people view the PyPA. But I’m painfully conscious that the reality is very different, whether or not we or they would like it to be. I wish we’d never (jokingly) called ourselves an “authority”, and I don’t know how we address this disconnect. But I’m fine with other PyPA members working that out, and I’ll go along with whatever works.
That’s entirely distinct from my role as interoperability standards PEP-delegate, where I have a very strong view that we need to standardise as much as possible, and encourage the growth of reusable libraries and competing tools as much as we can. So I want to see libraries like packaging, resolvelib, importlib.metadata, etc flourish. I want to see more build backends, all competing on an even playing field. I want to see competitors to pip. I want to see alternative approaches that don’t require bundling a bunch of stuff in pip - but not at the cost of making build backends other than setuptools into “second class citizens”.
I’m extremely aware that we’re a long way from that goal, and that all the projects and groups involved are extremely limited in resources, so progress is slow. But we need to encourage progress even so, and not stagnate just because it’s too hard to get people to look at the future.
Sorry - I’ll get off my soapbox now.
This is also where I feel there is a disconnect between the PyPA and other Python package distributors.
There are disconnects all over the place. It’s almost impossible to get feedback from end users. We’re continually struggling with the problem that we give the impression that all we care about is packaging specialists - because we can’t work out how to find out what packaging users want. I don’t dismiss distributions - it’s just that I see end users as a far more difficult, and yet far more important problem, and I prefer to focus on that. Others may have different priorities, and I’m fine with that.
perhaps we should just go ahead and create the ML.
That sounds like a good idea. But someone specific needs to do that, so someone needs to commit to finding whoever can create a mailing list and make it happen. I’m not criticising anyone, but it’s awfully easy to end up in a situation where everyone is vaguely agreeing that something should be done, but no-one is doing it. (To be clear, I personally have no idea how to get a mailing list created, so I’m not going to be doing anything to make this happen).