OK, in which case we’re still in the confused situation of not knowing who’s responsible: the user says they expected to be able to create a new file and have it visible; the frontend says the backend didn’t say to expose that; the backend says that it’s the frontend’s responsibility to do it; the frontend says that it expects the backend to pass the directory back if that’s what’s wanted; and so on.
So I guess we’re still no closer to knowing what “editable” is expected to mean. Ah, well. And the “virtual wheel” approach is still no closer to clarifying whose responsibility it is to say what gets exposed.
(A reminder - I’m fine with “we deliberately don’t specify” if that’s the position you want the virtual wheel PEP to take. I’m not trying to tell you how to write your PEP. I’m just explaining what I’ll be thinking about if I have to make a decision between two “competing” PEPs - although I hope it’s not being seen as a competition; we’re all looking for the best solution here.)
I guess all we can conclude is that it’s possible, but not in general, and it’s not on by default. As a Windows user I spent 5 minutes post-install to enable it and have been using it ever since; from my own POV it was well worth the effort considering the benefits. So I think we should give this choice to the developers: to make symlinks a hard requirement to solve their use case. Not by default, but definitely as an opt-in.
I don’t think that’s the case. I understood what @pganssle said to mean that the backend does not dictate what’s to be exposed - it still supplies the full set of files it would have included in a built wheel. In this setup, it will be necessary, however, for the backend to categorise the files, so that a frontend which wants to expose packages as a whole does not have to make assumptions about their nature. A rudimentary classification might include: package, module and package_resource.
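As a purely illustrative sketch of that idea: the three category labels below come from the paragraph above, but the field names and structure are invented, not from any PEP draft.

```python
# Invented structure showing how a backend might categorise the files
# it supplies, so the frontend needn't guess their nature.
files = [
    {"source": "/a/b/proj/src/foo/__init__.py",
     "target": "foo/__init__.py", "category": "package"},
    {"source": "/a/b/proj/src/bar.py",
     "target": "bar.py", "category": "module"},
    {"source": "/a/b/proj/src/foo/data.json",
     "target": "foo/data.json", "category": "package_resource"},
]
```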
Sorry, I thought I’d seen a suggestion that the backend might alternatively return a directory if it wanted to expose the whole directory, meaning the backend could take two different approaches and the frontend had to know that. But I can’t find that now, maybe it was someone else.
This would all be a lot easier if this level of detail was included in the PEP.
I wish it was possible to do it with just ‘here’s the package directory containing all the code’. But sadly that’s incompatible with namespace packages (if I symlink foo from distribution 1, distribution 2 can’t put anything under foo).
The vast majority of use cases I’m interested in look something like: 'src/foo represents the entire foo package, so symlink that directory as .../site-packages/foo'.
Namespace packages allow different projects to install in the same package, so foo.bar could come from one project and foo.baz from another. This is OK if foo itself is the namespace - you just make .../site-packages/foo and create symlinks in there. But if foo is a regular package containing your app and foo.plugins is a namespace package, then you have to symlink all the other files under src/foo, rather than symlinking the whole directory. It’s not impossible, just an extra complication necessitated by a feature I’ve never liked.
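To make the difference concrete, here is a rough sketch of the two strategies in Python, assuming symlinks are available at all (the paths and helper names are made up for illustration):

```python
# Two symlink strategies for an editable install (illustrative only).
import os

def symlink_whole_package(src_pkg, site_packages, name):
    """Fine when `name` is a regular package wholly owned by one
    project: one symlink exposes the whole directory, new files
    included."""
    os.symlink(src_pkg, os.path.join(site_packages, name))

def symlink_contents(src_pkg, dest_pkg):
    """Needed when dest_pkg is a namespace package shared with other
    distributions: link each entry individually, so other projects can
    add their own modules alongside."""
    os.makedirs(dest_pkg, exist_ok=True)
    for entry in os.listdir(src_pkg):
        os.symlink(os.path.join(src_pkg, entry),
                   os.path.join(dest_pkg, entry))
```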
No, to be clear it is always the frontend’s responsibility to choose between a wider install and a narrower install. The backend just says, “Here’s where all the files are supposed to be mapped from and to” and my SHOULD language was that they shouldn’t be trying to control how it gets installed. If you want the behavior where all the directories are exposed or some variation on that, that’s up to the front-end (which could be configuration options in a single front-end or many different front-ends specializing in different kind of editable installs).
To be concrete, say we have a backend whose internal rule is "install everything in /a/b/myproj/src/myproj as myproj/". If src/myproj contains __init__.py and foo.py, it will pass the following to the front-end (simplified, not in the structure of Bernat’s PEP):
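As a sketch (the dict structure here is illustrative, not the actual schema from Bernat’s PEP), that mapping could be as simple as target paths to absolute source locations:

```python
# Illustrative file mapping for the backend rule "install everything
# in /a/b/myproj/src/myproj as myproj/". Keys are target paths as they
# would appear in site-packages; values are source locations.
mapping = {
    "myproj/__init__.py": "/a/b/myproj/src/myproj/__init__.py",
    "myproj/foo.py": "/a/b/myproj/src/myproj/foo.py",
}
```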
The front-end is free to use a mechanism that detects that adding /a/b/myproj/src/ to sys.path is sufficient to expose myproj, and do that, even though there may be other stuff in /a/b/myproj/src/. It’s also free to do something like generate a .pth file that creates an import hook which detects any import of myproj and tries to import the relevant symbols from /a/b/myproj/src/myproj/ (whether or not they’re in the mapping). Or it could symlink the directory $PYTHON_ROOT/site-packages/myproj to /a/b/myproj/src/myproj/. All valid modes of installation that different front-ends (or the same front-end with different configuration options) can choose from, with trade-offs evaluated by the end user.
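The .pth-based sys.path variant in particular is simple enough to sketch. The file name convention below is invented, but any file ending in `.pth` inside site-packages is processed by the `site` module, and a plain path line in it is appended to sys.path at startup:

```python
# Sketch of the "add the source parent directory to sys.path" strategy
# implemented via a generated .pth file (illustrative file name).
import os

def write_pth(site_packages, project_name, src_dir):
    pth_path = os.path.join(site_packages,
                            f"__editable__.{project_name}.pth")
    with open(pth_path, "w", encoding="utf-8") as f:
        # A plain path line in a .pth file is appended to sys.path.
        f.write(src_dir + "\n")
    return pth_path
```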
It also has the option to symlink all the files and require re-installation to add new files, or to do something like generate a .pth file that installs an import hook which bakes in the mapping, detects when a file has appeared that isn’t in it, and triggers a custom error if someone has added myproj/src/myproj/bar.py, like:
ImportError: The module myproj.bar was not installed, but it is present in
the editable install directory: you may need to re-install the package to
expose new files, and/or update your project manifest.
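A minimal sketch of such a strict import hook, assuming a baked-in set of mapped module names and handling only single-level submodules for brevity (every name here is invented):

```python
# Sketch of the "strict" option: an import hook that bakes in the
# mapping and fails loudly when a module exists on disk but was not
# part of the mapping at install time.
import importlib.abc
import importlib.util
import os

class StrictEditableFinder(importlib.abc.MetaPathFinder):
    def __init__(self, top_level, src_dir, mapped):
        self.top_level = top_level  # e.g. "myproj"
        self.src_dir = src_dir      # e.g. "/a/b/myproj/src/myproj"
        self.mapped = mapped        # module names recorded at install time

    def find_spec(self, fullname, path=None, target=None):
        if not fullname.startswith(self.top_level + "."):
            return None  # not our package
        candidate = os.path.join(
            self.src_dir, fullname.split(".", 1)[1] + ".py")
        if fullname in self.mapped:
            return importlib.util.spec_from_file_location(fullname, candidate)
        if os.path.exists(candidate):
            # File appeared after installation: complain rather than
            # silently importing (or silently ignoring) it.
            raise ImportError(
                f"The module {fullname} was not installed, but it is "
                "present in the editable install directory: you may need "
                "to re-install the package to expose new files, and/or "
                "update your project manifest.")
        return None  # defer to other finders
```

A front-end would arrange for an instance of this to be appended to sys.meta_path, typically via a generated .pth file.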
Lots of options here, but at the core the responsibilities are very clear — the front end decides how strictly the installed layout matches the mapping.
And as I mentioned in an earlier reply, if we feel that a mapping of just files is not sufficient for front-ends to use to build robust heuristics that allow the behavior that users want in common use cases, we can take on a bit more complexity and design an optional, limited “here’s where I look for new files to go in the package” mapping. In which case the back-end would expose something like this:
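Purely for illustration (the structure and names below are invented here, not from any PEP draft), such a hints mapping might look like:

```python
# Hypothetical "content expansion hints" supplied alongside the exact
# file mapping: where a front-end MAY look for new files to expose.
hints = {
    # "anything appearing under this source tree belongs in myproj/"
    "myproj/": "/a/b/myproj/src/myproj/**",
}
```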
(or even something slightly more complicated that says, "Everything under this directory except things that start with tests/*"). In that case it would still always be the front-end’s responsibility to decide what to expose (as long as it always exposes at least what’s specified in the mapping). The back-end isn’t required to provide hints, but it may, and the front-end isn’t required to use the hints, but if it does use them it can assume that they match the mapping.
Edit: @layday’s suggestion of structuring the mapping may also work as a simpler version of “content expansion hints”. I’m not clear exactly how such a thing would work, but it would basically just be encoding the semantics that went into the mapping into the structure of the mapping itself, which I think is not a bad idea if done right.
Well, I doubt “setup.py stub” will be a supported mode forever, but my whole opposition to PEP 660 was that implementations don’t need to support automatically detecting new or renamed sub-packages (or files, or whatever particular edge case you care about) if you give this responsibility to the front-end: you just need at least one front-end that supports your use case.
A lot of people seem to be arguing as if the question is whether editable installs should be able to detect new files or not, whereas I think it’s clear that end users disagree on this, and so the best thing to do is to aim for a situation where each individual (not each project) decides what heuristics and trade-offs to make in different situations. In my world, even if pip and every other major front-end decided to go with strict installations, you could write your own custom single-purpose tool that just does the looser install you’d like, and it should work for every project out there (probably most of the tools you’d need to build such a thing are off the shelf, too, like pypa/build for creating isolated build environments).
OK. Thanks for clarifying. With my pip developer hat on, I await with interest the availability of the various support libraries so pip can decide which one (or more than one) to use to implement this logic. And of course I’m fine if only one reference implementation of the functionality is ever developed, we’ll just use that.
I don’t imagine pip ever implementing the logic itself - it’s clearly (to me) something that needs to be in a library, so that we don’t get trapped in the whole “implementation defined functionality” situation again (if someone did offer a PR to pip to implement this in-place I’d ask that it be split out as a reusable library).
That’s why I developed the editables library, to provide similar “off the shelf” functionality for backends in a PEP 660 world. But I won’t be doing the same for the “virtual wheel” proposal, because I don’t feel confident I know how to implement the logic. I’ll leave that to someone else who does feel comfortable taking on that task.
A project will work under the editable strategy its developer happens to use. The other strategy or strategies will be untested by the author and may not work. We don’t know whether a ‘strict’ or a ‘loose’ editable install would wind up being more popular, but we know the strategy setup.py develop uses. Under this proposal, whichever editable install strategy is more popular would be the one more likely to work for a randomly chosen project.
Suppose I’m distributing a package on github only and I like to use a ‘loose’ editable install. Other people are doing ‘strict’ editable installs off the main branch of my repository. I work on my package, then make sure it also works under a ‘strict’ editable install every time I commit to the main branch.
If you want a strict install, and you don’t mind typing ‘pip install’ every time you add or remove files, that is an ordinary install. You have to test the real wheel before you can expect your package to work off pypi.
Suppose we figure out a set of globs etc. for a precise ‘loose’ install. What we should then do to make this proposal work is remove the build_wheel hook and use the virtual wheel structure only. So pip can guarantee the rules used to make the ‘editable’ install match the distributed wheel.
Haven’t read all the rest of the replies since this one, because I’d like to ask you to stop expecting developers on Windows to be able to enable symlinks. That is simply not a viable assumption to make at the level we’re working at.
Happy to discuss another time all the reasons why, but I’m making this post a simple and direct request to drop this idea completely. Nothing we design can rely on symlinks being usable on Windows.
If you’re going to make that statement, I’d like you to also publish all those reasons when you have time for it. Thanks! PS. I wasn’t saying to make symlinks the only way, but a possible way, with some benefits, when available.
This is correct, but in all the other times you’ve referred to it, you’ve never included this nuance.
Feel free to take advantage of symlinks if they’re enabled. Do not require them to be enabled or assume that they will be.
(Another answer is that the vast majority of student laptops these days are as locked down as corporate PCs, so anything that doesn’t work without symlinks - even if it barely works and “we’ll display a message telling them to enable them” - is going to exclude the next generation of devs.)
I’ve spent a little time creating a library for the frontend as suggested above, which you can find here. Hopefully, this will help inform our choice of editable installation and will form the basis of a more robust implementation should this PEP or an equivalent be adopted.