PEP 582 - Python local packages directory

I just merged the changes to the draft PEP. I tested the one-file implementation that enables PEP 582 in both CPython and PyPy.

You can test it via GitHub - kushaldas/pep582

Here is an example directory structure after installing two different sets of packages with CPython and PyPy (an example script sits in the same top-level directory).

├── __pypackages__
│  ├── bin
│  │  ├── __pycache__
│  │  │  └── bottle.cpython-310.pyc
│  │  └──
│  └── lib
│     ├── pypy3.9
│     │  └── site-packages
│     │     ├── h2
│     │     │  ├──
│     │     │  ├── __pycache__
│     │     │  │  ├── __init__.pypy39.pyc
│     │     │  │  ├── config.pypy39.pyc
│     │     │  │  ├── connection.pypy39.pyc
│     │     │  │  ├── errors.pypy39.pyc
│     │     │  │  ├── events.pypy39.pyc
│     │     │  │  ├── exceptions.pypy39.pyc
│     │     │  │  ├── frame_buffer.pypy39.pyc
│     │     │  │  ├── settings.pypy39.pyc
│     │     │  │  ├── stream.pypy39.pyc
│     │     │  │  ├── utilities.pypy39.pyc
│     │     │  │  └── windows.pypy39.pyc
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  └──
│     │     ├── h2-4.1.0.dist-info
│     │     │  ├── INSTALLER
│     │     │  ├── LICENSE
│     │     │  ├── METADATA
│     │     │  ├── RECORD
│     │     │  ├── REQUESTED
│     │     │  ├── top_level.txt
│     │     │  └── WHEEL
│     │     ├── hpack
│     │     │  ├──
│     │     │  ├── __pycache__
│     │     │  │  ├── __init__.pypy39.pyc
│     │     │  │  ├── exceptions.pypy39.pyc
│     │     │  │  ├── hpack.pypy39.pyc
│     │     │  │  ├── huffman.pypy39.pyc
│     │     │  │  ├── huffman_constants.pypy39.pyc
│     │     │  │  ├── huffman_table.pypy39.pyc
│     │     │  │  ├── struct.pypy39.pyc
│     │     │  │  └── table.pypy39.pyc
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  └──
│     │     ├── hpack-4.0.0.dist-info
│     │     │  ├── INSTALLER
│     │     │  ├── LICENSE
│     │     │  ├── METADATA
│     │     │  ├── RECORD
│     │     │  ├── top_level.txt
│     │     │  └── WHEEL
│     │     ├── hyperframe
│     │     │  ├──
│     │     │  ├── __pycache__
│     │     │  │  ├── __init__.pypy39.pyc
│     │     │  │  ├── exceptions.pypy39.pyc
│     │     │  │  ├── flags.pypy39.pyc
│     │     │  │  └── frame.pypy39.pyc
│     │     │  ├──
│     │     │  ├──
│     │     │  ├──
│     │     │  └── py.typed
│     │     └── hyperframe-6.0.1.dist-info
│     │        ├── INSTALLER
│     │        ├── LICENSE
│     │        ├── METADATA
│     │        ├── RECORD
│     │        ├── top_level.txt
│     │        └── WHEEL
│     └── python3.10
│        └── site-packages
│           ├── __pycache__
│           │  └── bottle.cpython-310.pyc
│           ├── bottle-0.12.23.dist-info
│           │  ├── AUTHORS
│           │  ├── INSTALLER
│           │  ├── LICENSE
│           │  ├── METADATA
│           │  ├── RECORD
│           │  ├── REQUESTED
│           │  ├── top_level.txt
│           │  └── WHEEL
│           └──
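For context, here is a sketch of how an implementation might map the running interpreter onto the layout above (the function name and the "cpython → pythonX.Y, pypy → pypyX.Y" naming rule are my reading of the tree, not the reference implementation):

```python
import sys
from pathlib import Path

def pypackages_site(base="."):
    """Compute the interpreter-specific site-packages directory inside
    __pypackages__, mirroring the layout shown above. A sketch only."""
    impl = sys.implementation.name  # "cpython" or "pypy"
    ver = f"{sys.version_info.major}.{sys.version_info.minor}"
    subdir = f"python{ver}" if impl == "cpython" else f"{impl}{ver}"
    return Path(base) / "__pypackages__" / "lib" / subdir / "site-packages"
```

On CPython 3.10 this yields `__pypackages__/lib/python3.10/site-packages`, and on PyPy 3.9 `__pypackages__/lib/pypy3.9/site-packages`, matching the two trees above.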

That’s not true FWIW, since you could just do:

$ python3 -m pep582 -m blah

Learned something new, thank you :slight_smile:

Well, it’s nothing special: everything after the -m pep582 gets interpreted by the hypothetical pep582 module. So there’s nothing stopping it from having its own -m argument that just calls runpy.run_module after setting up the environment.
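For illustration, a minimal sketch of what that hypothetical pep582 module's entry point could look like (the single-directory layout and the argument handling are assumptions, not the real module):

```python
import runpy
import sys
from pathlib import Path

def run(argv):
    """Hypothetical `python3 -m pep582 ...` entry point: put the local
    packages directory on sys.path, then hand the rest of the command
    line to runpy. Simplified single-directory layout assumed."""
    pkgs = Path.cwd() / "__pypackages__" / "lib" / "site-packages"
    sys.path.insert(0, str(pkgs))
    if len(argv) >= 2 and argv[0] == "-m":
        # The nested -m: run the named module as __main__.
        runpy.run_module(argv[1], run_name="__main__")
    elif argv:
        runpy.run_path(argv[0], run_name="__main__")
```

So `python3 -m pep582 -m blah` would end up in `runpy.run_module("blah")` with `__pypackages__` already on `sys.path`.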


By “part of Python”, do you mean “distributed with Python” (ie standard library) or “built in to the interpreter”? I think there’s value in distributing the script with Python, for users who shouldn’t need to or can’t download packages.

The biggest value is that you don’t need to install the package in every copy of Python you have around. But in my view, that says that what we need here is a way to globally install tools so they can be used in all Python interpreters. That would be beneficial in far more cases than just this, and would probably reduce some of the pressure on wanting things in the stdlib as a result.

I don’t have a particular preferred solution here, but a very simple approach would be to leverage zipapps, and add an option to Python that looks for zipapps on the OS $PATH. So python --app foo would search $PATH for a zipapp called foo.pyz, and run it with the specified Python interpreter.

Then the pep582 module could be bundled into a zipapp, people could be instructed to add it to their $PATH, and it could be run via python --app pep582. And as I say, the benefit then isn’t limited to just pep582. For example, pip could be run as python --app pip (using the zipapp distribution of pip that’s hopefully coming in the next release). And there are likely plenty of other tools that would be useful as a zipapp, if the interpreter supported running them as easily as this.
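A sketch of that hypothetical lookup (the --app flag and the .pyz-on-$PATH convention are this thread's proposal, not an existing Python feature):

```python
import os
from pathlib import Path

def find_zipapp(name, path_env=None):
    """Return the first `<name>.pyz` found on $PATH, or None.
    Mirrors the proposed `python --app <name>` search."""
    entries = (path_env if path_env is not None
               else os.environ.get("PATH", ""))
    for entry in entries.split(os.pathsep):
        candidate = Path(entry) / f"{name}.pyz"
        if candidate.is_file():
            return candidate
    return None
```

The interpreter would then execute the returned archive the same way `python foo.pyz` does today.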


The primary users of PEP 582 are people who are new to Python/programming, and often even new to computers. Teaching them to create a directory (or clone an empty git repository for the tutorial) and then follow along at the standard Python prompt is much easier than anything else we have seen before.

This feature is not (most of the time) for experienced users; it helps the other 95% who are just starting with Python to get something done.


I think what bothers me is that it’s not just the interpreter that’s affected. In order to “just work” the way the PEP suggests, the whole ecosystem needs to be modified to automatically change behaviour if a __pypackages__ directory is detected. Installers (all of them, not just pip) need to install into that directory, IDEs need to look in there when setting up autocomplete, type checkers need to look there for .pyi files, etc. And how should tools that currently detect virtual environments work when a directory contains both .venv and __pypackages__? What about people working in a scratch directory, where they forgot that months ago they added a __pypackages__ directory, and try to pip install something globally? What should pip do if it’s being run from a virtualenv, but there’s a __pypackages__ directory? There’s just too many unanswered questions at this point.

IMO, it’s this need to change everything in order to achieve its goals that makes PEP 582 unrealistic. Nearly everything it proposes can be achieved already, if you’re willing to do it manually (modify sys.path at the start of your script, use pip install --target, etc). So the gain it offers comes largely from everything being automatic (as @kushaldas put it, “follow along with the standard Python prompt”). And realistically, I don’t imagine that will happen.
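For reference, the manual version of "modify sys.path at the start of your script" looks something like this (the directory name `deps` is just an example, e.g. populated beforehand via `pip install --target=deps somepackage`):

```python
import sys
from pathlib import Path

def add_local_deps(script_path, dirname="deps"):
    """Prepend a script-relative dependency directory to sys.path:
    the manual equivalent of what PEP 582 would automate."""
    deps = Path(script_path).resolve().parent / dirname
    sys.path.insert(0, str(deps))
    return str(deps)
```

A script would call `add_local_deps(__file__)` before its first third-party import, which is exactly the kind of boilerplate the PEP wants to make unnecessary.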

Personally, I’m very sympathetic to the view that we need to make it easier for newcomers to get started. But to do that, I think we need a much better understanding of how newcomers want things to work. And understanding that is something that UX specialists are much better at doing than developers. As part of the funded pip development in 2020, we had a UX team looking at how people used pip, and the results were full of useful insights. I think that if we want to improve the “new user experience” for Python, we should commission a user study like that (if we already had, and that’s what informed PEP 582, then I apologise - but I’m not aware of any such study).

Apart from the “newcomer experience”, the main other benefit I see for PEP 582 is making it easier to write and share simple scripts that need more than just the stdlib. With an emphasis on “share” - I can hack something together for one-off use with a temporary venv, but if I want to retain it for the future, or share it with someone else, I’m no longer just managing a single script file, but an “environment” (in some sense) that goes with that file. And that’s annoyingly hard (at least for fairly disorganised people like me). With __pypackages__, I can ship the script and the __pypackages__ directory. Or rename the script to __main__.py and zip it up with __pypackages__ and it’s a valid zipapp. But honestly, if I were designing something to make that use case easier (“write and share a script along with its 3rd-party dependencies”) I’d be looking for something different from PEP 582.
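To make the zipapp point concrete, here is a sketch using the stdlib zipapp module (the toy project name and contents are assumptions for illustration; a __pypackages__ directory could live alongside the __main__.py):

```python
import subprocess
import sys
import tempfile
import zipapp
from pathlib import Path

# Build a toy "project": a __main__.py that would sit next to any
# bundled dependencies.
proj = Path(tempfile.mkdtemp()) / "myproject"
proj.mkdir()
(proj / "__main__.py").write_text('print("hello from the zipapp")\n')

# Zip the whole directory into a single shareable file.
pyz = proj.with_suffix(".pyz")
zipapp.create_archive(proj, target=pyz, interpreter="/usr/bin/env python3")
```

The resulting `myproject.pyz` is one file that can be passed around and run with any Python interpreter.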

So overall, I’m coming to think that PEP 582 is the wrong solution here. It’s possible that we still need to experiment to find the right solution. For that, the suggestion of having a 3rd-party implementation that we can iterate on, and that people can use to activate the functionality, would be great. And if it’s too hard to make such a thing available, then maybe that is the problem we should solve first, so that we don’t have the issue of ideas like this needing to be implemented in the interpreter before they can be of any use…


For anyone attempting that: please don’t name it after the PEP number. The number is only meaningful to a few people, and PEP-582-the-document is a draft now and will likely be frozen later. Don’t repeat pep8 or pep512.


When we started working on this PEP, @dstufft told me to make sure that we do not try to dictate how pip (or any installer) should behave when installing packages into the __pypackages__ directory. Once Python itself decides to use the directory for packages, the ecosystem will follow along and adapt as required.

I still have hope in the community and the tool authors that this will happen. We can already see PDM; we never pinged the author and asked them to do it, but because they saw the benefits, they went ahead, even though the PEP is still in draft.

True; meanwhile, the benefits of PEP 582 are something we (the people who train newcomers regularly) see every time we train. People come with various operating systems, and when they are new to programming/computers, they really struggle to understand virtualenv. If you look at the general view of this PEP, you will notice why some folks are excited to have it now.

Um, OK. I can’t speak for what Donald meant by that, but personally I’d expect that we would require an explicit opt-in to installing into __pypackages__. That seems to directly contradict your hope that newcomers could “just follow existing instructions” and have things automatically use a local package directory.

Also, pip relies on sysconfig to define the layout of install schemes (we have special cases and exceptions to handle Linux distributions’ quirks, but we’re trying to move away from that to a model where distros patch sysconfig and pip just picks that up). So I would expect the layout of the local packages directory to be defined properly as a sysconfig scheme, if it’s going to get supported by pip.

Would the view be so positive if installers like pip didn’t support it? That’s a genuine question. I’m currently looking at the various options pip has for specifying where things get installed (--target, --user, --prefix, --root), and it’s frankly a bit of a mess. I want to improve this, but in the short term that means I’m very strongly against adding extra possibilities here, as it will make it even harder to fix properly. I don’t want PEP 582 to get accepted and then for pip to be under pressure to “support local package directories” immediately.

I absolutely 100% feel your pain here and it’s a super important use-case. But, the thing I don’t understand is: PEP 582 is a specific low-level mechanism; it’s not something beginners will be exposed to directly. They’ll need some PDM-like tool to manage the contents of their __pypackages__. So if they’re going to be using a tool like this anyway, why not let that tool handle launching python and setting up the environment appropriately? Wouldn’t that be even simpler than PEP 582, both to implement and for users?


I don’t remember, but I think I was mostly concerned that the PEP shouldn’t dictate the UX of how pip (or any installer) works. Like it can and should describe things like what the sysconfig scheme for it is, if there are any special considerations installers need to make to support this, etc.

But it shouldn’t mandate to installers things like:

  • Whether it should use a __pypackages__ directory by default if one exists.
  • Whether it should create a __pypackages__ directory by default if one doesn’t exist.
  • If it does the above, whether it should only look in CWD, or whether it should recurse upwards looking for one.
  • etc
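For example, the "recurse upwards" option in that list could look like this (one possible installer UX choice, explicitly not something the PEP would mandate):

```python
from pathlib import Path

def find_pypackages(start="."):
    """Walk upward from `start` and return the nearest __pypackages__
    directory, or None if the search reaches the filesystem root."""
    here = Path(start).resolve()
    for directory in (here, *here.parents):
        candidate = directory / "__pypackages__"
        if candidate.is_dir():
            return candidate
    return None
```

Whether an installer does this, only checks the CWD, or ignores __pypackages__ entirely would remain that installer's decision.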

To me those things are UX decisions for individual installer projects to manage. This PEP should be focused on the pieces that need coordination between multiple projects specific to the changes in this PEP, which I believe are largely going to be related to whatever mechanism is being used to add this directory to sys.path (whether that be as part of interpreter start up, or an explicit module, or whatever).

This focuses the PEP on the mechanics that are new to the PEP, and leaves how installers choose to expose this PEP (if they choose to expose it at all) up to those installers themselves.

I think if the major installers chose not to support PEP 582, then the bulk of the benefit of PEP 582 is more or less moot. For me, if there’s not buy in from at least some of the major tools that they want this, then it’s probably a strong signal that it’s either not selling itself well enough, or it’s the wrong idea for the ecosystem.

Of course, if some of the tools ultimately decide not to implement it, but the PEP gets accepted, it’s not really easy or possible to prevent users of that tool from asking for it to support that PEP.

(To be clear, I don’t have strong feelings one way or another about 582)

This is what sits badly with me. I find it hard to reconcile “the PEP shouldn’t dictate the UX of how pip works” against the fact that if pip doesn’t support it, most of the benefit of the PEP is lost. And it’s hard not to view that dilemma as confrontational (“people want the PEP so pip has to change even though the PEP doesn’t insist on it” vs “pip wants to be cautious about over-complicating its UX so the PEP will be useless until we decide otherwise”).

The PEP is pretty explicit that it expects pip to default to __pypackages__:

In another example scenario, a trainer of a Python class can say “Today we are going to learn how to use Twisted! To start, please checkout our example project, go to that directory, and then run python3 -m pip install twisted.”

That will install Twisted into a directory separate from python3.

And that’s the specific pip UI decision I want to push back on. If we’re going to take the position that the PEP shouldn’t dictate the pip UX, then that example needs to go. And without that example, we lose a lot of the stated benefits of the PEP.

To be clear here, I want pip to get better at managing “environments” that aren’t the site-packages of some Python installation. Managing __pypackages__ is certainly something we could include in that. But such a change is not going to happen quickly, and I definitely wouldn’t want to commit to __pypackages__ ever being the default environment that gets managed.

The more I think about this, the more I feel that the problems PEP 582 would address[1] are good ones to solve. But I think that PEP 582 is a bad way to solve them. I think that if we need a PEP, it should be one that enables third party solutions to these problems - we should be trying to get away from the situation where UX issues have to be solved in the core interpreter.

  1. New user experience, and sharing simple scripts. ↩︎


Let’s look at what it takes to use __pypackages__ in PDM. If you do it locally, it’s pdm run. Otherwise you can do a [global setup](Working with PEP 582 - PDM). The latter solution is a non-starter for beginners (who, I think we all agree, struggle with virtual environments and installing things). I see enough users who don’t run `conda init` correctly that I don’t expect them to get that global setup right either.

As for conda run, this then gets into the whole question of what is meant by “tool” above. As an example, how is VS Code supposed to know that, in order to use the packages installed by the tool, it must use pdm run? Doing tool-specific support is not very sustainable (e.g. when I started on the extension we would have simply stopped at Poetry and pipenv, which leaves out PDM, Hatch, etc.). And while I can come up with a solution just for VS Code, that doesn’t help e.g. Emacs users.

Magically. :wink: I can tell you that too many people still install using straight pip in a shell without understanding what that command implies about installation location. And plenty of folks have no concept of environments (virtual or conda). And that’s all assuming you have the tooling installed; we all know of a popular Linux distro that lacks these sorts of tools out of the box (I believe that is somewhat fixed in their next release in a couple of years, but we also know that knowledge of this change will take a while to propagate).

For me, the key thing here is we have a stack of potential tooling where there is a cascade of:

  1. Interpreter
  2. Environment
  3. Package installer
  4. Editor/IDE

Each tool in that list needs to know about the tool that came before it. Unfortunately, the information doesn’t necessarily flow down to those that need it. PEP 582 somewhat helps with that by trying to get the whole stack to understand a single concept without having to communicate through the stack.


As both a fledgling Python dev and as someone who teaches newcomers, anything that reduces friction and gets me working is very welcome.

Besides, this is discoverable magic :slight_smile:
I can always delight my fellow learners by telling them to “look in the box” that is the __pypackages__ directory, if they want to learn more.

Well, one principle from the Zen is “Explicit is better than implicit”. Shouldn’t we be teaching that to beginners? So expanding the example in the PEP: “We’re going to start by creating a Python project. So open a command prompt, decide on a project name, and type pyproject create name”. That hypothetical pyproject command can do the setup that’s needed (which could, if you wanted, include creating a virtual environment). Then to run the project script, pyproject run.

This also admits the possibility that not everything is a Python “project” - something we should also be teaching beginners. You can use Python to write simple scripts, too - unlike some other languages, where everything has to be a project, in its own directory.

Oh, and if the issue is getting the users to install that pyproject command, then aren’t we back to the fact that one of the underlying problems here is that it’s too difficult to create utilities in Python that can easily be installed and managed by the user (Rust has cargo install, JS has npm install, maybe Python needs something like this)?

Don’t virtual environments also do exactly that? The only difference is that virtual environments are perceived as “too hard”. Something I don’t disagree with, but why is the response to replace them rather than to make them easier to use?


For pupils in a school, just opening a command prompt and running a simple command is already a hurdle, and raises OS-specific questions: Powershell or cmd.exe? How was Python installed, do I need to run it as py or python or python3 or with a path? And the reaction might be “WTF” because the command line is “scary”, giving the impression that Python is harder to set up than it is.

Just my 2 cents.


Virtual environments communicate using environment variables that are set in the current terminal session process and are implicitly inherited by processes created by the user in that terminal session.

PEP 582 proposes to communicate via a fixed constant and some kind of “relative to another file” rule (currently undefined).

So yes, they both do it, and unfortunately making venv not rely on session environment variables is entirely breaking, because you then require some other kind of ambient state. Usually, people resolve this today by making their tools figure out where the venv is relative to the “current” file or the current working directory, which is more-or-less what PEP 582 is aiming to codify.

I don’t quite see what you mean here. It’s entirely possible to use a virtual environment without activating it, so there are no environment variables involved. The name of the virtual environment can be communicated by convention between the tools involved (.venv is the conventional name, so I’d assume we’d go for that). The difference is that the interpreter doesn’t need to be involved, so the name can just be an agreed convention between tools, rather than a PEP.
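As a concrete illustration of "use a virtual environment without activating it", stdlib only (with_pip=False just keeps creation fast for the demo):

```python
import subprocess
import sys
import tempfile
import venv
from pathlib import Path

# Create a venv, never "activate" it.
env_dir = Path(tempfile.mkdtemp()) / ".venv"
venv.create(env_dir, with_pip=False)

# Invoke its interpreter directly by path; no environment variables set.
bindir = "Scripts" if sys.platform == "win32" else "bin"
py = env_dir / bindir / ("python.exe" if sys.platform == "win32" else "python")
result = subprocess.run(
    [str(py), "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True,
)
```

The child process reports the venv directory as its `sys.prefix`, with activation scripts and session environment variables never involved.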

But I’m not that bothered - I’m not wedded to requiring virtual environments, just pointing out that a suite of tools plus a convention should be sufficient without needing a PEP or language change.