PEP 582 - Python local packages directory

FWIW I still think PEP 582 is a bad approach, for all the reasons we talked about before. Do you think there are new developments that change that calculus?

I think the general community (tooling) support for the PEP, even though it isn’t accepted, plus people continuously asking for it, suggests it’s pragmatically good enough for folks. Otherwise we’ll have to see what the PEP update contains.


I wonder if @frostming can provide some insights given that PDM has supported it for a while now. FWIW, I find it quite handy for development.

I think there’s still a need, and the fact that no-one has come up with an alternative approach in the time PEP 582 has been on hold suggests that we should at least accept that the choice is more likely to be between PEP 582 and nothing, and not between PEP 582 and “something better”.

Personally, I’ve become more supportive of PEP 582 for exactly that reason. I want to look at the details again, but I’m inclined not to let the perfect be the enemy of the good.

I can’t say I’ve noticed a bunch of tools that let me use a local packages directory. Did I miss them?


I know of PDM and PyFlow off the top of my head. My (potentially faulty) memory may also be stemming from the fact that every tool I am involved with that has anything to do with environments has been asked to support __pypackages__. So either PDM is way more popular than we all realize or something else is supporting it that I can’t remember off the top of my head that makes folks ask for its support.


I think it’s just familiarity with other ecosystems like JS.

If this gets accepted, people will start committing the directory to version control systems like Git on day 1, which scares me greatly.


Is that somehow worse than people committing virtual environments? You’ll find plenty of site-packages and pyvenv.cfg files as well on any major public version control hosting provider.


No, but it would now be standardized rather than discouraged, so it will be far more common and not (mostly) accidental.


It’s not retroactive, but the standard Python .gitignore template already excludes __pypackages__.

GitHub (and probably others) also warns about large individual file uploads. Personally, I think many tools write potentially large “working” files into the source tree, so users know to look out for them when staged. They can also be cleaned up later.

I think it’s probably possible to craft a .gitignore in __pypackages__ itself that prevents it from being staged. That would make things more difficult for users who DO intentionally wish to commit the folder, however.
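For illustration, a self-ignoring file along these lines (a sketch; the exact contents are my assumption, not anything the PEP specifies) could be dropped into the directory by an installer:

```
# __pypackages__/.gitignore (hypothetical): ignore everything in this
# directory, including this file itself
*
```

A user who does want to commit the folder would then have to delete this file first, which is the extra friction mentioned above.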


But I mean… This is what proves that the PEP is a bad idea. Tools like PDM/PyFlow/Hatch/etc. can already accomplish the same thing the PEP does, but better in every way. (Lockfiles, running outside the project root, managing multiple environments, managing project scripts, alternative disk layouts that avoid venv inefficiencies, …)

In other words: why would you want an awkward, half-working implementation of PDM jammed into CPython, where it can never change, instead of just using PDM?


FYI for those mainly wanting venv standardization for IDEs and other tools, this is the better way: Support a plug-in system to assist in environment/interpreter discovery · Discussion #168 · brettcannon/python-launcher · GitHub


That’s a good point. Maybe the PEP should instead standardize what a tool supporting __pypackages__ needs to do, rather than propose it for Python itself.

Maybe it’s also worth comparing built-in __pypackages__ to built-in venv support? Why do we feel like the latter should go into Python but not the former? FWIW, I really prefer python3 -m venv over the third-party virtualenv, despite the more limited feature set. I should try to figure out why :smile:

Maybe it’s because I really don’t need the feature provided by virtualenv, and having python -m venv right there is good enough. I like and use abstractions built on top of venv, such as tox, and when I actually need to create a venv, I just want something quick and dirty (and more ephemeral) to experiment with. But it’s also a bit of a hassle either way. OTOH, this could be a useful lesson in the other direction, i.e. python -m venv is a subset of virtualenv and it’s good enough, so maybe a stripped down PEP 582 support in the stdlib will be good enough for most cases too, except when you want and need all the additional things those other tools give you?


Because PEP 582 is not trying to replace PDM in CPython. Instead, it allows CPython to pick up the __pypackages__ directory and use it as part of the path to search for packages. The tool used to install packages into that directory is totally the user’s choice. Also, this PEP currently doesn’t try to tell the tool (say, pip or PDM) how it should install there.
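As a rough sketch of what that path lookup amounts to (following the {python_version}/lib layout the PEP uses in its examples; the helper name here is hypothetical):

```python
import os
import sys

def pypackages_dir(base_dir):
    """Hypothetical helper: the extra sys.path entry a PEP 582-style
    interpreter might compute, following the PEP's example layout."""
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    return os.path.join(base_dir, "__pypackages__", version, "lib")

# e.g. for a script run from a project directory:
print(pypackages_dir(os.getcwd()))
```

Everything above and below that entry on sys.path stays exactly as the interpreter would normally set it up; the PEP only adds the lookup.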


Thinking about this, I like the idea that it’s a scale of gradually more stripped down approaches. My interest in PEP 582 is definitely for cases where I find venvs a bit too heavyweight (in particular, the need to use the venv’s Python interpreter, and the fact that copying a venv around doesn’t work). For those cases, a directory of dependencies is just about right (and it has the advantage that it can easily be zipped up to make a zipapp).

Yes, this can be done manually right now. It’s basically a matter of adding sys.path.insert(0, os.path.join(os.path.dirname(__file__), "__pypackages__")) to the top of all your scripts that you want to use the feature. But having to do this manually feels just a little too “stripped down”. Having a standard location that’s added by default is a convenience, but IMO that’s the point - it’s convenient. And for cases where people prefer a different directory name, or otherwise want a slight variation, they still have the option to drop down to the sys.path.insert approach.
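Spelled out as a complete snippet, the manual approach just described looks something like this (a sketch, not tied to any particular tool):

```python
# Prepend the script's local __pypackages__ directory to sys.path so
# that imports below resolve against it before the usual locations.
import os
import sys

_here = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(_here, "__pypackages__"))

# From this point on, `import some_dependency` will look in
# ./__pypackages__ first.
```
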

It also gives a standard location for zipapps that bundle their dependencies to put them, which again is not necessary, but is convenient.

The rest of the PEP, I’m less enthusiastic about. I don’t really see a point in the {python_version}/lib subdirectory of __pypackages__ (which the PEP uses in examples, but doesn’t seem to require in the text). I’m uncomfortable about broadly mandating that “package management tools” should automatically install to __pypackages__ if it’s present. But I would like tools to be better able to manage such a directory (I have a longstanding desire to improve pip’s --target option to cover this). I guess this is where PDM’s existing support fits in?

I think the PEP needs to discuss other stdlib modules, as well. The sysconfig module might need a pypackages scheme - or would the home scheme work (in which case what about Windows, as there’s only a posix_home, not an nt_home)?


It’s worth noting that the revamped PDM 2.0 chose to use virtual environments by default: What's New In PDM 2.0? | Frost's Blog


This is my feeling as well. My take away from previous discussions is that most people are totally OK with the __pypackages__ directory[1] (it is “just” a different layout of a virtual environment, after all), but many feel uneasy that it is automatically “activated” by merely running the interpreter.

From a tool builder’s perspective, however, I can see why it’s tempting to want the interpreter to directly support this. Environment isolation and activation is, fundamentally, modifying sys.path, and Python currently offers very few ways for a wrapper tool to do this, none of them good. The most obvious approach is to set PYTHONPATH (pythonloc and Pyflow use this), which is extremely limiting and has a ton of problems (the most important one being that the global site-packages leaks into the “environment”). PDM uses a sitecustomize hack to mimic virtual environment behaviour (i.e. actually isolated!), but if you’ve ever tried to write a sitecustomize you’d know how messy that is, especially if the interpreter you want to overlay already has a custom site or sitecustomize, which the user would expect you to inherit. If anyone remembers, this was actually one big selling point of PEP 405: it enabled virtualenv to drop a ton of annoying hacks (that didn’t work all the time).

I think the real fundamental problem is that Python does not offer a good way for a tool like PDM or Pyflow to customise sys.path, and adding the right hooks to the interpreter would empower an entire class of tools to do good things that the core interpreter may not want to do itself. Of course, it’d still be a good idea to have standardisation around this to avoid fragmentation and user confusion, like how packaging tools moved to PEPs 517, 518, 621, etc.
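To make the sitecustomize-style workaround concrete, here is a toy sketch of what such a hook does (this is an illustration of the idea, not PDM’s actual code; the function name is made up):

```python
# sitecustomize.py (hypothetical): Python imports this module at
# startup if it is on the path, so a wrapper tool can ship one to
# splice a project-local directory into sys.path.
import os
import sys

def activate_pypackages(base_dir):
    """Prepend base_dir/__pypackages__ to sys.path if it exists."""
    pkg_dir = os.path.join(base_dir, "__pypackages__")
    if os.path.isdir(pkg_dir):
        sys.path.insert(0, pkg_dir)
    return pkg_dir

activate_pypackages(os.getcwd())
```

The messiness described above comes from the real-world details this sketch ignores: coexisting with an interpreter’s existing site or sitecustomize, and keeping the global site-packages out of the result.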

  1. I’m actually one of the exceptions that do have issues with the current directory structure, but that’s not the point here so let’s skip it for now. ↩︎


This is a good compromise, and something I’m not completely against unlike the current proposal.

As am I, quite frankly. But I think that’s solvable, either through another command line option, or a -m invocable module. In this case, EIBTI. Would this be so bad?

$ python3 -m pep582

(modulo, of course, bikeshedding on the name.)

If you’re activating it in that way, there’s no requirement to have pep582 be part of Python.

Then we would also lose the ability to run any other module installed in __pypackages__ via python -m, since __main__ would already be taken by the activation module.