PEP 582 - Python local packages directory

FWIW I still think PEP 582 is a bad approach, for all the reasons we talked about before. Do you think there are new developments that change that calculus?

I think the general community (tooling) support for the PEP, even though it isn't accepted, and the fact that people continuously ask for it, suggests it's pragmatically good enough for folks. Otherwise, we will have to see what the PEP update contains.

5 Likes

I wonder if @frostming can provide some insights given that PDM has supported it for a while now. FWIW, I find it quite handy for development.

I think there's still a need, and the fact that no-one has come up with an alternative approach in the time PEP 582 has been on hold suggests that we should at least accept that the choice is more likely to be between PEP 582 and nothing, and not between PEP 582 and "something better".

Personally, I've become more supportive of PEP 582 for exactly that reason. I want to look at the details again, but I'm inclined not to let the perfect be the enemy of the good.

I can't say I've noticed a bunch of tools that let me use a local packages directory. Did I miss them?

3 Likes

I know of PDM and PyFlow off the top of my head. My (potentially faulty) memory may also be stemming from the fact that every tool I am involved with that has anything to do with environments has been asked to support __pypackages__. So either PDM is way more popular than we all realize, or something else is supporting it that I can't remember off the top of my head that makes folks ask for its support.

1 Like

I think it's just familiarity with other ecosystems like JS.

If this gets accepted, people will start committing the directory to version control systems like Git on day 1, which scares me greatly.

3 Likes

Is that somehow worse than people committing virtual environments? You'll find plenty of site-packages and pyvenv.cfg files as well on any major public version control hosting provider.

5 Likes

No, but it would now be standardized rather than discouraged, so it will be far more common and not (mostly) accidental.

2 Likes

It's not retroactive, but the standard Python .gitignore template already excludes __pypackages__.

GitHub (and probably others) also warns about large individual file uploads. Personally, I think many tools write potentially large "working" files into the source tree, so users know to look out for them when staging. They can also be cleaned up later.

I think it's probably possible to craft a .gitignore in __pypackages__ itself that prevents it being staged. That will make it more difficult for users who DO intentionally wish to commit the folder, however.
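
For illustration, a minimal sketch of that trick - the same one virtualenv uses for the environments it creates - would be a one-line .gitignore placed inside the directory by whatever tool populates it:

# __pypackages__/.gitignore
# Ignore everything in this directory, including this file itself.
*

Since the file ignores itself, git stays quiet unless a user deliberately deletes it or force-adds the directory with git add -f.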

3 Likes

But I mean… This is what proves that the PEP is a bad idea. Tools like PDM/PyFlow/Hatch/etc. can already accomplish the same thing the PEP does, but better in every way. (Lockfiles, running outside the project root, managing multiple environments, managing project scripts, alternative disk layouts that avoid venv inefficiencies, …)

In other words: why would you want an awkward, half-working implementation of PDM jammed into CPython, where it can never change, instead of using PDM?

3 Likes

FYI for those mainly wanting venv standardization for IDEs and other tools, this is the better way: Support a plug-in system to assist in environment/interpreter discovery · Discussion #168 · brettcannon/python-launcher · GitHub

2 Likes

That's a good point. Maybe the PEP should instead standardize what a tool supporting __pypackages__ needs to do, rather than propose it for Python itself.

Maybe it's also worth comparing built-in __pypackages__ support to built-in venv support? Why do we feel like the latter should go into Python but not the former? FWIW, I really prefer python3 -m venv over the third-party virtualenv, despite the more limited feature set. I should try to figure out why :smile:

Maybe it's because I really don't need the features provided by virtualenv, and having python -m venv right there is good enough. I like and use abstractions built on top of venv, such as tox, and when I actually need to create a venv, I just want something quick and dirty (and more ephemeral) to experiment with. But it's also a bit of a hassle either way. OTOH, this could be a useful lesson in the other direction, i.e. python -m venv is a subset of virtualenv and it's good enough, so maybe stripped-down PEP 582 support in the stdlib will be good enough for most cases too, except when you want and need all the additional things those other tools give you?

2 Likes

Because PEP 582 is not trying to replace PDM in CPython. Instead, it allows CPython to pick up the __pypackages__ directory and use it as part of the path to search for packages. The tool used to install packages into that directory is totally the user's choice. Also, this PEP is currently not trying to tell the tool (say pip or PDM) how it should install there.
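
For concreteness, using the {python_version}/lib layout from the PEP's examples (the version number here is illustrative):

myproject/
    script.py
    __pypackages__/
        3.11/
            lib/
                requests/

$ cd myproject
$ python3 script.py    # "import requests" now resolves from __pypackages__

The interpreter only does the path lookup; any installer capable of writing into that lib directory works.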

3 Likes

Thinking about this, I like the idea that it's a scale of gradually more stripped-down approaches. My interest in PEP 582 is definitely for cases where I find venvs a bit too heavyweight (in particular, the need to use the venv's Python interpreter, and the fact that copying a venv around doesn't work). For those cases, a directory of dependencies is just about right (and it has the advantage that it can easily be zipped up to make a zipapp).

Yes, this can be done manually right now. It's basically a matter of adding sys.path.insert(0, os.path.join(os.path.dirname(__file__), "__pypackages__")) to the top of all the scripts that you want to use the feature. But having to do this manually feels just a little too "stripped down". Having a standard location that's added by default is a convenience, but IMO that's the point - it's convenient. And for cases where people prefer a different directory name, or otherwise want a slight variation, they still have the option to drop down to the sys.path.insert approach.
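
Spelled out as a complete sketch (ignoring the PEP's versioned lib subdirectory for simplicity, and with requests as an illustrative dependency):

import os
import sys

# Prepend the script-local package directory, if present, so that
# anything installed there shadows the global site-packages.
_pypackages = os.path.join(os.path.dirname(os.path.abspath(__file__)), "__pypackages__")
if os.path.isdir(_pypackages):
    sys.path.insert(0, _pypackages)

import requests  # now found in __pypackages__ first, if installed there

The PEP essentially just makes those first few lines implicit.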

It also gives a standard location for zipapps that bundle their dependencies to put them, which again is not necessary, but is convenient.

The rest of the PEP, I'm less enthusiastic about. I don't really see a point in the {python_version}/lib subdirectory of __pypackages__ (which the PEP uses in examples, but doesn't seem to require in the text). I'm uncomfortable about broadly mandating that "package management tools" should automatically install to __pypackages__ if it's present. But I would like tools to be better able to manage such a directory (I have a longstanding desire to improve pip's --target option to cover this). I guess this is where PDM's existing support fits in?
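
For reference, pip's existing --target option can already populate such a directory by hand; assuming the PEP's example layout (version illustrative):

$ python3 -m pip install --target __pypackages__/3.11/lib requests

That covers simple installs, but upgrading or uninstalling packages in such a directory is exactly where --target falls short today.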

I think the PEP needs to discuss other stdlib modules as well. The sysconfig module might need a pypackages scheme - or would the home scheme work (in which case, what about Windows, as there's only a posix_home, not an nt_home)?
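
To illustrate, a sketch of what reusing the home scheme looks like today (the scheme name and base value are just what currently exists; the 3.11 path is illustrative):

import sysconfig

# Query the existing posix_home scheme with __pypackages__/3.11 as the base.
paths = sysconfig.get_paths("posix_home", vars={"base": "__pypackages__/3.11"})
print(paths["purelib"])  # __pypackages__/3.11/lib/python

Note that this gives lib/python rather than the plain lib the PEP's examples show, and on Windows there's no nt_home counterpart at all - which is why a dedicated pypackages scheme may be needed.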

1 Like

It's worth noting that the revamped PDM 2.0 chose to use virtual environments by default: What's New In PDM 2.0? | Frost's Blog

4 Likes

This is my feeling as well. My takeaway from previous discussions is that most people are totally OK with the __pypackages__ directory[1] (it is "just" a different layout of a virtual environment, after all), but many feel uneasy that it is automatically "activated" by merely running the interpreter.

From a tool builder's perspective, however, I can see why it's tempting to want the interpreter to directly support this. Environment isolation and activation is, fundamentally, modifying sys.path, and Python currently offers very few choices for a wrapper tool to do this, none of them good. The most obvious approach is to set PYTHONPATH (pythonloc and Pyflow use this), which is extremely limiting and has a ton of problems (the most important one being that the global site-packages leaks into the "environment"). PDM uses a sitecustomize hack to mimic virtual environment behaviour (i.e. actually isolated!), but if you ever tried to write a sitecustomize you'd know how messy that is, especially if the interpreter you want to overlay on already has a custom site or sitecustomize, which the user would expect you to be able to inherit. If anyone remembers, this is actually one big thing about PEP 405: it enabled virtualenv to drop a ton of annoying hacks (that don't work all the time).
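
For anyone who hasn't written one, a deliberately simplified sketch of that kind of sitecustomize hack (NOT PDM's actual implementation) looks something like this; the wrapper tool puts a directory containing this file on PYTHONPATH before launching the interpreter:

# sitecustomize.py - simplified sketch, not PDM's real code.
# The site module imports this automatically at interpreter startup.
import os
import site
import sys

_pypackages = os.path.join(os.getcwd(), "__pypackages__")
if os.path.isdir(_pypackages):
    # Drop the global site-packages entries to mimic venv-style isolation...
    for _path in site.getsitepackages():
        while _path in sys.path:
            sys.path.remove(_path)
    # ...and put the project-local directory first.
    sys.path.insert(0, _pypackages)

Even this toy version hints at the mess: only the first sitecustomize found on sys.path is imported, so it silently shadows any sitecustomize the underlying interpreter already had.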

I think the real fundamental problem is that Python does not offer a good way for a tool like PDM or Pyflow to customise sys.path, and adding the right hooks to the interpreter would better empower an entire class of tools to do good things that the core interpreter may not want to do itself. Of course, it'd still be a good idea to have standardisation around this to avoid fragmentation and user confusion, like how packaging tools moved to PEPs 517, 518, 621, etc.


  1. I'm actually one of the exceptions that do have issues with the current directory structure, but that's not the point here so let's skip it for now. ↩︎

2 Likes

This is a good compromise, and something I'm not completely against, unlike the current proposal.

As am I, quite frankly. But I think that's solvable, either through another command line option, or a -m invocable module. In this case, EIBTI (explicit is better than implicit). Would this be so bad?

$ python3 -m pep582 blah.py

(Modulo, of course, bikeshedding on the pep582.py name.)
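
A hypothetical sketch of what such a module could contain (the name and details are illustrative, not from the PEP):

# pep582.py - hypothetical opt-in runner, illustrative only.
import os
import runpy
import sys

def main():
    script = sys.argv[1]
    # Shift argv so the script sees a normal invocation.
    sys.argv = sys.argv[1:]
    pypackages = os.path.join(os.path.dirname(os.path.abspath(script)), "__pypackages__")
    if os.path.isdir(pypackages):
        sys.path.insert(0, pypackages)
    runpy.run_path(script, run_name="__main__")

if __name__ == "__main__":
    main()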

If you're activating it in that way, there's no requirement to have pep582 be part of Python.

Then we will also lose the power of using any other module installed in __pypackages__ via __main__ (i.e. python -m somemodule).