FWIW I still think PEP 582 is a bad approach, for all the reasons we talked about before. Do you think there are new developments that change that calculus?
I think the general community (tooling) support for the PEP even though it isn't accepted, and the fact that people continuously ask for it, suggests it's pragmatically good enough for folks. Otherwise we will have to see what the PEP update contains.
I wonder if @frostming can provide some insights given that PDM has supported it for a while now. FWIW, I find it quite handy for development.
I think there's still a need, and the fact that no one has come up with an alternative approach in the time PEP 582 has been on hold suggests that we should at least accept that the choice is more likely to be between PEP 582 and nothing, and not between PEP 582 and "something better".
Personally, I've become more supportive of PEP 582 for exactly that reason. I want to look at the details again, but I'm inclined not to let the perfect be the enemy of the good.
I can't say I've noticed a bunch of tools that let me use a local packages directory. Did I miss them?
I know of PDM and PyFlow off the top of my head. My (potentially faulty) memory may also be stemming from the fact that every tool I am involved with that has anything to do with environments has been asked to support __pypackages__. So either PDM is way more popular than we all realize, or something else that I can't remember off the top of my head is supporting it and making folks ask for its support.
I think it's just familiarity with other ecosystems like JS.
If this gets accepted, people will start committing the directory to version control systems like Git on day 1, which scares me greatly.
Is that somehow worse than people committing virtual environments? You'll find plenty of site-packages and pyvenv.cfg files as well on any major public version control hosting provider.
No, but it would now be standardized rather than discouraged, so it will be far more common and not (mostly) accidental.
It's not retroactive, but the standard .gitignore already excludes __pypackages__.
GitHub (and probably others) also warns about large individual file uploads. Personally, I think many tools write potentially large "working" files into the source tree, so users know to look out for them when staging. It can also be cleaned up later.
I think it's probably possible to craft a .gitignore in __pypackages__ itself that prevents it being staged. That will make it more difficult for users who DO intentionally wish to commit the folder, however.
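For illustration, here is roughly what a tool managing the directory could do. This is my own sketch rather than anything a current tool is documented to do (though I believe virtualenv does something similar for the environments it creates):

```python
# Sketch: drop a self-excluding .gitignore into the managed directory so the
# whole tree is ignored without touching the project's own .gitignore.
# (Users who DO want to commit the folder can simply delete this file.)
from pathlib import Path

pkg_dir = Path("__pypackages__")
pkg_dir.mkdir(exist_ok=True)
gitignore = pkg_dir / ".gitignore"
if not gitignore.exists():
    # "*" ignores everything in this directory, including the .gitignore itself.
    gitignore.write_text("*\n")
```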
But I mean… This is what proves that the PEP is a bad idea. Tools like PDM/PyFlow/Hatch/etc. can already accomplish the same thing the PEP does, but better in every way. (Lockfiles, running outside the project root, managing multiple environments, managing project scripts, alternative disk layouts that avoid venv inefficiencies, …)
In other words: why would you want an awkward half-working implementation of PDM jammed into CPython where it can never change, instead of using PDM?
FYI for those mainly wanting venv standardization for IDEs and other tools, this is the better way: Support a plug-in system to assist in environment/interpreter discovery · Discussion #168 · brettcannon/python-launcher · GitHub
That's a good point. Maybe the PEP should instead standardize what a tool supporting __pypackages__ needs to do, rather than propose it for Python itself.
Maybe it's also worth comparing built-in __pypackages__ support to built-in venv support? Why do we feel like the latter should go into Python but not the former? FWIW, I really prefer python3 -m venv over the third-party virtualenv, even despite the more limited feature set. I should try to figure out why.

Maybe it's because I really don't need the features provided by virtualenv, and having python -m venv right there is good enough. I like and use abstractions built on top of venv, such as tox, and when I actually need to create a venv, I just want something quick and dirty (and more ephemeral) to experiment with. But it's also a bit of a hassle either way. OTOH, this could be a useful lesson in the other direction, i.e. python -m venv is a subset of virtualenv and it's good enough, so maybe a stripped-down PEP 582 support in the stdlib would be good enough for most cases too, except when you want and need all the additional things those other tools give you?
Because PEP 582 is not trying to replace PDM in CPython. Instead it allows CPython to pick up the __pypackages__ directory and use it as part of the path to search for packages. Which tool is used to install packages into that directory is totally the user's choice. Also, this PEP is currently not trying to tell the tool (say pip or pdm) how it should install there.
Thinking about this, I like the idea that it's a scale of gradually more stripped-down approaches. My interest in PEP 582 is definitely for cases where I find venvs a bit too heavyweight (in particular, the need to use the venv's Python interpreter, and the fact that copying a venv around doesn't work). For those cases, a directory of dependencies is just about right (and it has the advantage that it can easily be zipped up to make a zipapp).
Yes, this can be done manually right now. It's basically a matter of adding sys.path.insert(0, os.path.join(os.path.dirname(__file__), "__pypackages__")) to the top of all your scripts that you want to use the feature. But having to do this manually feels just a little too "stripped down". Having a standard location that's added by default is a convenience, but IMO that's the point - it's convenient. And for cases where people prefer a different directory name, or otherwise want a slight variation, they still have the option to drop down to the sys.path.insert approach.
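For concreteness, a minimal sketch of that manual approach, assuming a flat __pypackages__ layout with packages installed directly into it (no {python_version}/lib subdirectory):

```python
# Prepend a project-local __pypackages__ directory to the import path.
# Assumes dependencies were installed straight into __pypackages__
# (e.g. with pip's --target option), not into a per-version subdirectory.
import os
import sys

sys.path.insert(0, os.path.join(os.path.dirname(__file__), "__pypackages__"))
```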
It also gives a standard location for zipapps that bundle their dependencies to put them, which again is not necessary, but is convenient.
The rest of the PEP, I'm less enthusiastic about. I don't really see a point in the {python_version}/lib subdirectory of __pypackages__ (which the PEP uses in examples, but doesn't seem to require in the text). I'm uncomfortable about broadly mandating that "package management tools" should automatically install to __pypackages__ if it's present. But I would like tools to be better able to manage such a directory (I have a longstanding desire to improve pip's --target option to cover this). I guess this is where PDM's existing support fits in?
I think the PEP needs to discuss other stdlib modules as well. The sysconfig module might need a pypackages scheme - or would the home scheme work (in which case, what about Windows, as there's only a posix_home, not an nt_home)?
It's worth noting that the revamped PDM 2.0 chose to use virtual environments by default: What's New In PDM 2.0? | Frost's Blog
This is my feeling as well. My takeaway from previous discussions is that most people are totally OK with the __pypackages__ directory[1] (it is "just" a different layout of a virtual environment, after all), but many feel uneasy that it is automatically "activated" by merely running the interpreter.
From a tool builder's perspective, however, I can see why it's tempting to want the interpreter to directly support this. Environment isolation and activation is, fundamentally, modifying sys.path, and Python currently offers very few choices for a wrapper tool to do this, and none of them are even good for that. The most obvious approach is to set PYTHONPATH (pythonloc and Pyflow use this), which is extremely limiting and has a ton of problems (the most important one being that the global site-packages leaks into the "environment"). PDM uses a sitecustomize hack to mimic virtual environment behaviour (i.e. actually isolated!), but if you have ever tried to write a sitecustomize you'd know how messy that is, especially if the interpreter you want to overlay on already has a custom site or sitecustomize, which the user would expect you to be able to inherit. If anyone remembers, this is actually one big thing about PEP 405: it enabled virtualenv to drop a ton of annoying hacks (that don't work all the time).
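For anyone who hasn't written one, here is a very rough sketch of the flavour of sitecustomize hack being described. It is not PDM's actual code, and the __pypackages__/{version}/lib layout is just the one the PEP's examples use:

```python
# sitecustomize.py - placed in a directory that the wrapper tool puts on
# PYTHONPATH, so the interpreter imports it at startup.
import os
import site
import sys

libdir = os.path.join(
    os.getcwd(),
    "__pypackages__",
    "{}.{}".format(*sys.version_info[:2]),
    "lib",
)
if os.path.isdir(libdir):
    # addsitedir() appends the directory and processes any .pth files in it;
    # then move the newly added entries to the front so they shadow site-packages.
    before = set(sys.path)
    site.addsitedir(libdir)
    added = [p for p in sys.path if p not in before]
    sys.path[:] = added + [p for p in sys.path if p not in added]
```

Even this simple version shows the problem: it only prepends, it does not actually isolate anything, and it collides with any sitecustomize the underlying interpreter already has.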
I think the real fundamental problem is that Python does not offer a good way for a tool like PDM or Pyflow to customise sys.path, and adding the right hooks to the interpreter would better empower an entire class of tools to do good things that the core interpreter may not want to. Of course, it'd still be a good idea to have standardisation around this to avoid fragmentation and user confusion, like how packaging tools moved to PEPs 517, 518, 621, etc.
[1] I'm actually one of the exceptions that do have issues with the current directory structure, but that's not the point here, so let's skip it for now.
This is a good compromise, and something I'm not completely against, unlike the current proposal.
many feel uneasy that it is automatically "activated" by merely running the interpreter
As am I, quite frankly. But I think that's solvable, either through another command line option, or a -m invocable module. In this case, EIBTI. Would this be so bad?
$ python3 -m pep582 blah.py
(modulo of course bikeshedding on the pep582.py name.)
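Purely to illustrate, a sketch of what such an opt-in runner could look like (the module name and the flat __pypackages__ layout are placeholders, not anything the PEP specifies):

```python
# pep582.py (hypothetical) - run a script with its ./__pypackages__ on sys.path.
import os
import runpy
import sys


def main():
    script = sys.argv[1]
    pkg_dir = os.path.join(os.path.dirname(os.path.abspath(script)), "__pypackages__")
    if os.path.isdir(pkg_dir):
        sys.path.insert(0, pkg_dir)
    # Let the target script see the remaining arguments, then run it as __main__.
    sys.argv[:] = sys.argv[1:]
    runpy.run_path(script, run_name="__main__")


if __name__ == "__main__":
    main()
```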
As am I quite frankly. But I think that's solvable, either through another command line option, or a -m invocable module. […] $ python3 -m pep582 blah.py
If you're activating it in that way, there's no requirement to have pep582 be part of Python.
As am I quite frankly. But I think that's solvable, either through another command line option, or a -m invocable module. […] $ python3 -m pep582 blah.py
Then we will also lose the ability to use any other module installed in __pypackages__ via __main__.