Yup, that discussion definitely got me rethinking a lot. Cargo’s custom build script seems to work pretty well and can be used as inspiration. The build script (analogous to our PEP 517 backend) is a Rust program that emits information back to Cargo (the PEP 517 frontend), including (a rough sketch follows the list):
When a rebuild should be triggered
What the build script’s result contributes to the build environment (not sure what the analogy would be for Python)
What the frontend should do with the result (extra linker flags; the analogy would be which files the frontend should copy)
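To make the analogy a bit more concrete, here is a rough, purely hypothetical sketch of what such a hook could report on the Python side; the hook name and the returned structure are invented for illustration and are not part of PEP 517:

```python
# Hypothetical sketch only -- this hook name and return format do not
# exist in PEP 517.  The idea mirrors a Cargo build script: the backend
# reports facts to the frontend instead of modifying the environment.
def prepare_editable_info(source_dir, config_settings=None):
    return {
        # analogous to Cargo's rerun-if-changed directives
        "rebuild_if_changed": ["src/**/*.c", "setup.cfg"],
        # analogous to extra linker flags: what the frontend should do
        # with the build result (e.g. which files to copy or symlink)
        "install": [
            {"source": "src/mypkg", "kind": "symlink"},
        ],
    }
```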
Why an “escape hatch”? Why not just keep the historical behaviour of pip install -e .? It seems there is ideological opposition to it and very little concern for the practical issues, as outlined by @hynek and me above.
+1 with that. It reinforces my feeling that editable installs can be delegated entirely to backends.
With such an approach the scope of standardization of editable installs could be reduced (an illustrative sketch follows the list):
so that frontends can be aware that an editable install occurred (for instance, today pip is totally unaware that a distribution was installed with, say, flit install --symlink and therefore pip list and pip freeze can’t output meaningful information)
so tools such as tox can invoke editable installs in a uniform way, to be able to provide test stack traces that point to the original source code
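Purely as an illustration of the first bullet, frontend awareness could be as small as a marker recorded next to the installed metadata; the file name and layout below are invented for the example, not an existing standard:

```python
# Illustrative sketch only: one way a frontend could record that a
# distribution was installed in editable mode, so that "pip list" and
# "pip freeze" could later report something meaningful.  The file name
# "EDITABLE.json" and its layout are invented for this example.
import json
from pathlib import Path


def record_editable_install(dist_info_dir: Path, source_dir: Path) -> None:
    marker = {
        "editable": True,
        "source": str(source_dir.resolve()),
        "mechanism": "symlink",  # or ".pth", "setup.py develop", ...
    }
    (dist_info_dir / "EDITABLE.json").write_text(json.dumps(marker, indent=2))
```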
First, the historical behavior involves setuptools ignoring the frontend (pip) and just YOLO’ing over the user’s environment, so it’s really not obvious how to generalize it to support other build backends in a consistent way. Even if we keep the backend doing most of the work, we still have to figure out how the frontend tells the backend where to stick the files (which needs to respect the user’s configuration passed to the build frontend), and we need to figure out how to support uninstalls. Ideally in a way that’s consistent and generic enough to allow new tools and features to be invented in the future.
Second, the historical semantics have some very sharp edges that can easily bite users. Everyone agrees that the functionality is super useful, but it would be even nicer if we could keep that functionality without requiring every user to understand all the intricacies of which edits require which kinds of rebuilds and track them in their head. That’s an impossible task for a beginner, and even for experts it’s a waste of mental energy that could be spent on more productive things. Tracking this kind of thing is what build systems are for.
Like, yeah, Cython dependencies are complicated. But surely it’s still easier to teach a computer how to figure them out than it is to teach every user how to figure them out in their head. And more reliable too. It would be nice to at least have the option of using smart build systems in the future.
And yeah, in the meantime there’s nothing stopping individual backends from offering setuptools-style editable installs. Flit supports them today. So it’s not like this is an emergency that has to be solved yesterday.
To reiterate the point here, even if pip install -e . doesn’t work for you, you can still use python setup.py develop, or flit install --symlink as appropriate. This will of course annoy the people who insist that everything should be available as a pip command, which leads us right back to @steve.dower’s comment earlier:
I broke out the “single tool” discussion as best as I could to Developing a single tool for building/developing projects. I can’t split a second time, so the editable install discussion can stay here or we can start a new discussion if we need to start writing down the exact requirements for an editable install that e.g. .pth files don’t solve.
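For context on that last point, the .pth baseline being referred to is roughly the sketch below (the paths are illustrative); its comments note the gaps a standard would need to cover:

```python
# Minimal sketch of the ".pth" approach to an editable install:
# a text file in site-packages whose lines (existing directories)
# are appended to sys.path at interpreter startup.
import sysconfig
from pathlib import Path

project_src = Path("/home/me/myproject/src")          # illustrative path
site_packages = Path(sysconfig.get_paths()["purelib"])

# After this, "import mypkg" resolves against the working tree -- but
# only for pure-Python code; extension modules still need rebuilds, and
# nothing records metadata for pip list, pip freeze, or uninstall.
(site_packages / "myproject.pth").write_text(f"{project_src}\n")
```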
Thanks for doing that Brett! I was definitely feeling that discussion was mixing in with the original conversation here.
In case someone’s wondering – I’ll be AWOL from this discussion for another week FWIW, because it’s the last week of college before end-sem exams, which means lots of submissions.
Not to necropost (also I’m not sure if it is best to just create new threads so Brett doesn’t have to split things off all the time), but as someone who works on a pretty popular command line tool, we are constantly asked to support pyproject.toml.
There is a lot of confusion around this as a tool author.
I posted here a while ago to clarify that tools can indeed use the [tool] table of pyproject.toml, but most people don’t know that. In addition, there is a lot of uncertainty about yet another file format, and people don’t want to add support for one. If we want to get to a single tool, there must be a stage where some authority (wink wink) says “start using pyproject.toml, stop using everything else” (i.e. deprecate setup.cfg).
On top of this, based on searching for issues about pyproject.toml adoption related to tools, it seems many are blocked on requiring a TOML parser. Some projects do not want to add a dependency if they don’t have to, and having toml not in the stdlib is harming pyproject.toml adoption. I don’t know if this was considered in the original PEP discussion, but it is somewhat problematic (especially since toml is probably going to take another 6 months at least to hit 1.0, though maybe @pradyunsg knows better).
As a tool author I should not need to think about other parts of the file.
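For what it’s worth, once a TOML parser is available, reading only your own table is a handful of lines. This sketch assumes the third-party toml package and uses “mytool” as a placeholder name:

```python
# Sketch: a tool reading only its own table from pyproject.toml,
# ignoring [build-system] and every other tool's section.
# Assumes the third-party "toml" package; "mytool" is a placeholder.
import toml

EXAMPLE = """
[build-system]                 # irrelevant to a tool author
requires = ["setuptools", "wheel"]

[tool.mytool]
line-length = 88
exclude = ["build/"]
"""

config = toml.loads(EXAMPLE).get("tool", {}).get("mytool", {})
print(config)   # {'line-length': 88, 'exclude': ['build/']}
```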
I think there needs to be clear leadership and engagement about deciding what direction to go with regard to pyproject.toml as the recommended tool configuration file; otherwise these projects are going to spin a lot of cycles debating whether or not to implement it and generate more confusion.
I know your point is general, but what tool is that? I’m interested for context, in terms of how widely the pressure to support pyproject.toml is extending. When PEP 518 was written, we imagined that other build tools would adopt it, and we considered that it might be adopted by development tools, but that’s as far as we expected things to go.
I don’t think the PyPA really has the authority to decide on general tool configuration questions. This is something that would need consensus from tool authors in general, and most of those are not PyPA members, nor do they follow packaging discussions.
If you’re looking for some sort of standard on whether to use pyproject.toml, then I think you’d need an (informational) PEP - and specifically, a Python PEP rather than a packaging PEP. That’s about the only way I can see to reach tool authors in general. That would also be a good way to open the debate on whether a Python standard should recommend a format that Python doesn’t have a stdlib module for (the trade-offs are different for packaging tools, and the reasoning for packaging tools is covered in PEP 518, but as you say it doesn’t apply the same to general tools).
Sure, but hasn’t it been only a year since pyproject.toml got flipped on for projects in pip and it broke a ton of stuff for those that had opted in? It’s not like having pyproject.toml work as expected is old hat for everyone. If as a group we can say we are comfortable with how pyproject.toml functions in the packaging world then we could talk about pushing projects to adopt it both for their build tools and as a place to store their configuration details, but at packaging speed this is all fairly new.
It was and we decided it was still worth going with TOML.
If it does then it does, but once TOML reaches 1.0 I am sure the discussion of getting a module into the stdlib will commence. But supporting a not-yet-finished standard in the stdlib is much more problematic IMO than some projects refusing to add a single dependency to potentially switch to using pyproject.toml 6 months to 1 year sooner than they would have.
It was precisely because tools like Black started asking projects to use pyproject.toml that we made the build-system table optional: people were complaining that tools wanted them to add pyproject.toml but that they didn’t want to port their build tool as well. So it was either risk stalling out pyproject.toml adoption by saying “it’s all or nothing” and people choosing “nothing”, or let it potentially become a project configuration file organically. We obviously chose the latter. The world is messy.
If you have a specific proposal of what you think should change then that can obviously be discussed.
Go ahead and adopt it. It’s there and it’s not about to be deprecated or ignored by tools.
And even then I think it’s out of scope for a PEP. We don’t have a PEP on how to lay out your projects and various other things that are common to projects. It’s a slippery slope when the language starts to dictate how projects should structure themselves. Now if there was the PyTT (Python Tools Troop) or something then I would say they should put out a recommendation, but having the language do it doesn’t seem quite right to me.
It is up to the tool to decide how to handle [a pyproject.toml file without the build-system table]. Potential options are:
Act as if the pyproject.toml file does not exist.
Assume the table exists with the default values specified above.
In practice everyone would simply follow pip’s behaviour (option 1 if the table is missing entirely; option 2 if the table exists but is missing values); why offer the choice instead of specifying that?
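For concreteness, the behaviour described above amounts to something like this sketch, using PEP 518’s default requires value; this is a sketch of the described logic, not pip’s actual code:

```python
# Sketch of the behaviour described above -- not pip's actual code.
# Option 1: no [build-system] table at all -> treat pyproject.toml as absent.
# Option 2: table present but incomplete -> fill in the PEP 518 defaults.
import toml

DEFAULT_REQUIRES = ["setuptools", "wheel"]   # defaults from PEP 518


def resolve_build_system(pyproject_path):
    data = toml.load(pyproject_path)
    table = data.get("build-system")
    if table is None:
        return None                      # option 1: as if the file did not exist
    return {                             # option 2: assume the default values
        "requires": table.get("requires", DEFAULT_REQUIRES),
        "build-backend": table.get("build-backend"),
    }
```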
Responding to the mention but haven’t caught up on the rest of this thread yet – whenever I find ~1 week to work on this, I’d be able to get stuff done for this release.
Hopefully, now that I don’t have classes to attend at college, I’ll have a bit more time before my “find a job” phase begins.
I honestly don’t remember, but I’m sure there was some reason. I have no objection to changing the PEP to drop the second option if someone wants to start a new thread to discuss it to surface it a bit more.
If I recall correctly, the only reason we made the handling of missing build-system settings implementation defined is because it was a post-publication edit, and it’s the default convention in standards development to soften such additions to recommendations rather than requirements.
In this particular case, we’re not actually aware of any tools that do anything different though, so making it a requirement should reduce long term confusion.
While that is true, I would argue it should have authority to decide on build time tool configuration, which is really what is of concern here.
Fair.
Don’t get me wrong, I love the choice of toml! I’m just communicating that this choice was painful for many users, which I hope we can agree is something to be avoided.
I propose the tool should either “Act as if the pyproject.toml file does not exist (with some deprecation notice)” or “Error that the section is required” and at some future date kill the first option.
I think you are missing the point. As discussed earlier in the thread, multiple competing options with no clear default lead to confusion and needless debate.
I strongly believe that how a project lays out its configuration should be standardized in some way (though it doesn’t have to be at the language level). I think this is the second best decision Rust made after the borrow checker, in fact, and a large reason why cargo is such a delight to use. If we look at npm, there is a clear definition of what keys and such should exist in package.json. If pip requires something and has a documented default, that is as close to an unofficial standard as you can get.
Things don’t need to be PEPs for people to use them. If something is PyPA-recommended, and that tool makes certain requirements, a lot of people will use it (if not everyone). Just look at pipenv’s popularity as an example.
Wow that is excellent! Thank you so much for working on this, I’m sure many people will be happy to see 1.0 for toml.
Except Cargo.toml is not the standard all-in-one project configuration you say it is. I’d feel it’s reasonable to compare Mypy to Clippy, which uses its own configuration file, clippy.toml, not Cargo.toml.
The stance of the PyPA is quite clearly to exert a certain amount of control over that file. How pip implemented PEP 517 also signals that it expects packaging implications from the file’s presence. And I feel that is a reasonable resolution (although maybe the PyPA needs to be even more explicit in sending this message).
You also have to realize that we authors of the PEP have been raked over the coals for this decision and it still comes up on a semi-regular basis (other than this thread, I was asked about this just last week on Twitter, with a bit of talkback when they didn’t totally agree with our reasoning), so for personal sanity I try to avoid this specific topic.
If that’s the exact change you want then please start a new thread to discuss this specific change, else this is liable to get lost in this topic as it’s already spread across two different topics (I would do it myself by splitting this thread but other discussions of responsibility and such are already intertwined and so it wouldn’t be a clean split).
Ah, but see, that’s the trick: who is supposed to standardize it in the Python community? I mean, how much of a statement are you after beyond the fact that PEP 517 and 518 exist to show the direction the PyPA is proposing for packaging going forward, and thus if people really hate their file counts they can come along for the ride and use pyproject.toml as well? No one ever made a statement about setup.py or setup.cfg being a standard, and yet people have been talking to me over the past month as if they are standards simply because of their wide use.
For me, I think the way to get uptake for pyproject.toml is to make sure the key projects support it appropriately and then get people talking about it publicly in such a way that they view it as a net positive rather than just some busywork to switch file formats.
First, I appreciate what we have. Python is great. The people who make it are great too. I mean, I get to write code and anyone in the world can use it. It’s amazing, and I thank all of you working on this.
When I first learned about packaging (some years ago), I remember the journey from Python package to PyPI library did take a while to grasp. It was not hard, but there were some hoops to jump through. Years later, I still have to review my notes. In other words, distributed packaging doesn’t fit in my head like most other Python paradigms, so the most general pain point I can recall might be the workflow: the files involved and why they are needed, i.e. setup.cfg, MANIFEST.in.
In the end, I’d like to:
make a package
distribute it on PyPI or as an .exe
have any end-user run it via terminal, or (the dream) one-click solution
Good news, we are mostly there, save the last item. I’m hopeful the way we redesign distributed packaging now will help get all three in future Python versions.