PEP 582 - Python local packages directory

I have to agree on this point. I use venvs extensively, but I never
“activate” them. I either just call their python3 directly, or other
entrypoints from their bin directories, and have never encountered a
problem with that.
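A minimal sketch of that no-activation workflow (directory names here are arbitrary): create a venv, then invoke its interpreter by path, without ever sourcing bin/activate:

```python
import os
import subprocess
import tempfile
import venv
from pathlib import Path

# Create a throwaway venv, then invoke its interpreter directly,
# with no "activate" step at all.
tmp = Path(tempfile.mkdtemp())
venv.create(tmp / "env", with_pip=False)  # skip pip for speed

bindir = "Scripts" if os.name == "nt" else "bin"
py = tmp / "env" / bindir / ("python.exe" if os.name == "nt" else "python")

out = subprocess.run(
    [str(py), "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True, check=True,
)
# The child reports the venv as its prefix: isolation works
# without activation.
print(out.stdout.strip())
```

The same applies to any entry point in the venv's bin directory: run it by path and it uses its own venv's packages.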

Maybe we should document pipx more prominently?

Try describing how to make pipx work for a newcomer from scratch; pipx is on the right track, but Python still lacks a good bootstrapping story. I don’t like how PEP 582 is currently solving it, but the described problem is real.

3 Likes

I’m not clear how this would interact with existing environment mechanisms (virtualenv, conda, etc.). Whichever mechanism takes precedence over the other, I worry that in trying to make things easier for beginners, it will make things harder a bit further down the line, because it’s another ‘why can’t I import the version I just installed’ trap - i.e. you might install things in one place and import them from the other.

It feels like this would be a pretty good proposal in a world where we didn’t already have an environment mechanism. But we’ve got multiple mechanisms already, and we’ve been pushing people to use them for the last decade or so. I think the PEP really needs to describe in some detail how __pypackages__ would interact with the existing environment mechanisms, and where that might catch people out.

(Also, I can presently launch a subprocess using sys.executable and mostly expect it to see the same packages as the parent process. I’m aware of some significant caveats, but this is still a useful technique. __pypackages__ would make this more complex. I’m sure we’d figure that out, but code that currently works would end up broken.)

10 Likes

And so is “Simple is better than complex” and “practicality beats purity”.

Maybe, but how easy do you expect that discussion to be? :wink: If people want to define the platypus, we can have that discussion (separately).

No, because virtual environments don’t necessarily speak to installation (i.e. Hatch handles the installation into virtual environments, not the other way around).

No one has come up with a better solution.

If we don’t all come up with a good solution then I will have to come up with one in VS Code that not all of you like and leaves out other editors (and that’s not meant to come off as a threat, more that I know not all of you like my opinions :wink:). That’s not something I want to do, but I will do it if that’s what it takes to get Python developers a good experience in VS Code. But I would much rather work based on standards as much as possible, as that means it’s less about my opinion and more about the community’s opinion.

6 Likes

(Newcomer to this thread. The thread is a little long, so please tolerate my quick summary)

We want to make environment management “magically work” for beginners. In particular, the manual activation and deactivation of venvs is a pain point.

Constraints mentioned in this thread:

  • Avoid dictating the UX of pip (or other installers)
  • Avoid complex interactions with existing environment mechanisms (venv, conda)
  • Avoid breaking people (code that uses sys.executable, beliefs about how pip should install things)
  • Minimise special casing for tools that need to understand Python environments (VSCode, other IDEs, type checkers, linters, etc)

njs mentions that users will need a tool to manage their environment, so the tool could launch python and set up the environment as well. This feels very promising to me and seems to solve most of the constraints above:

  • The user interacts with the project management tool of their choice. This lets them get a cohesive UX, without placing constraints on other tools and installers. It also allows for more innovation in the project management space than if we had a single PEP-blessed UX.
  • We aren’t inventing a new environment mechanism or layout. E.g., poetry just uses virtual environments to do the job.
  • No code is broken! (We use existing environment mechanisms and don’t change the behaviour of the interpreter or sys.executable or pip)
  • It’s a pattern that exists in tools today, e.g. poetry has poetry run, poetry shell, poetry init, etc.

Unfortunately, the last constraint about minimising special casing of tools through the stack needs to be resolved. But rather than standardise a whole new environment mechanism, it seems easier to standardise communication of environments between tools!

…I might regret this, but here’s a proposal for what this could look like that should look pretty similar to __pypackages__ (apologies for all its inevitable flaws and shortcomings. I can only hope that it’s not worse than whatever Brett unilaterally decides for VSCode :wink: ):

A “.venv” file in the project root that contains a path to a virtual environment

Any project management tool can have whatever UX it likes, but behind the scenes it should manage a virtual environment somewhere, and create a “.venv” file in project root. IDEs can then know to look in that venv. It’s trivial for existing project management tools to adopt. An installer could decide to install things in that environment.
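A minimal sketch of how an IDE or installer might resolve such a “.venv” file (the function name is made up, and the fallback to a .venv *directory* is my own assumption, since several tools already use that convention):

```python
from __future__ import annotations

import os
from pathlib import Path


def resolve_project_interpreter(project_root: Path) -> Path | None:
    """Hypothetical resolver for the proposed ".venv" file: the file
    holds a path (absolute or project-relative) to a virtual
    environment; return that environment's interpreter, if present."""
    marker = project_root / ".venv"
    if marker.is_dir():
        # Some tools already use a `.venv` *directory*; honor that too.
        env = marker
    elif marker.is_file():
        env = Path(marker.read_text().strip())
        if not env.is_absolute():
            env = project_root / env
    else:
        return None
    bindir = "Scripts" if os.name == "nt" else "bin"
    exe = env / bindir / ("python.exe" if os.name == "nt" else "python")
    return exe if exe.exists() else None
```

An IDE could call this once per workspace to pick the interpreter; an installer could use the same lookup to decide where to install.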

Not-so-coincidentally, this is what my setup uses today. I have a shell plugin that’s basically a tiny project management tool. It looks for a “.venv” file and automatically activates and deactivates the environment as my cwd changes. It also has commands to easily create and delete venvs, e.g. suggesting to create a new venv when entering a folder that looks like a Python project.

My partner is a Python novice, but I set this up for her, and things mostly just work. In this n=1 experience, she’s stumbled on two things:

  • First, she has to manually “Select interpreter” in VSCode for new projects. The proposal above would allow VSCode to automatically select the interpreter, getting rid of this step.

  • Second, she often uses Python as a standalone tool outside of a project context. She was surprised when packages she’d installed in a project weren’t available to her – like many beginners, she assumed a single global environment. This required a one-time explanation. Using a project management tool that wraps most commands would likely have made this more obvious.

  • Bonus implicit third, I had to set this up for her. This calls back to the separate thread alluded to above about bootstrapping installs. But this isn’t insurmountable, e.g. I feel poetry has a pretty nice bootstrap experience, and if pipx enters mainstream consciousness that would work too.

3 Likes

To reiterate, needing “manual activation and deactivation of venvs”
seems to be a gross misunderstanding, or perhaps cargo-culting from
instructions which claim doing so is necessary. I never “activate”
or “deactivate” a venv. I have a tree of them in a lib subdirectory
of my homedir isolating a variety of tools installed from PyPI. In
my ~/bin is a symlink farm exposing executables for those tools from
their respective venvs. With ~/bin automatically added to my $PATH
through typical shell initialization config, I can just invoke the
name of whatever tool I need to run, as surely as if it were
something in /usr/bin on the system.
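For illustration, here is a rough sketch of that symlink-farm arrangement, assuming a POSIX venv layout (the helper and directory names are made up, not any existing tool):

```python
from pathlib import Path


def expose_tool(venv_dir: Path, tool: str, bin_farm: Path) -> Path:
    """Hypothetical helper: symlink one executable out of a venv's
    bin/ into a shared directory that is already on $PATH."""
    bin_farm.mkdir(parents=True, exist_ok=True)
    target = venv_dir / "bin" / tool  # POSIX venv layout assumed
    link = bin_farm / tool
    if link.is_symlink() or link.exists():
        link.unlink()  # replace a stale link from an earlier install
    link.symlink_to(target)
    return link
```

Run once per tool (e.g. exposing a hypothetical `mytool` from `~/lib/mytool-venv` into `~/bin`), and from then on the tool is invoked by name, exactly as described above.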

2 Likes

That’s huge. On macOS, most package directories are really well hidden in some corner of the file system, which makes poking into them a bit cumbersome at first. __pypackages__ would be visible (unlike the typical venv in .venv) and still close by (unlike system Python, homebrew Python, pyenv Python).

Please, no! The whole beauty of __pypackages__ is that it shouldn’t require initialization or setup. It just gets created. That would be huge not just for learners but everyone else as well. Yes, setting up a venv or pyenv isn’t too hard. But I have to remember it and deal with it even when trying out just one package’s functions in the REPL.

The combination of batteries-included (no more heroic pdm hackery) with no-setup-required is extremely compelling magic. That it also is easily discoverable magic because it is close-by and visible in the file system positively distinguishes __pypackages__ from everything else. That makes PEP-582 really compelling in my mind.

I also don’t see how implementing it would be so hard. As several have pointed out, all the ingredients are already there. It’s just getting a few lines into the startup scripts for the interpreter and supporting one more variation of a sysconfig theme. Why does this seem unreasonable?

1 Like

I think I need a restatement here of what we’re trying to solve. What precisely in the current VS Code experience is causing sufficient problems that you’re willing to implement a unilateral solution if we don’t agree a standard? And how do you expect to make such a solution work smoothly if tools don’t change to support it? (Because to be frank, a solution that needs no python or ecosystem changes sounds good to me…)

I think that this could be a good solution but why the indirection? The solution should be simple.

A .venv directory as the default venv is already recognized by multiple tools, including VS Code (which I use). If indirection is needed, let’s use the standard way - symlinks. If the platform has problems with symlinks, then a regular file can be treated as a symlink (i.e. your original suggestion).

Note: I am just afraid that venvs break when you change their path (when you move them). Do I remember this correctly? If so, I think they should be made relocatable, if that is not extremely complicated.

They do. It’s essentially because the entry point scripts contain the absolute path of the Python executable. This is intended, because copying just those scripts while leaving the venv where it is, is a supported use case (pipx uses it, for example).
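A quick demonstration of why moving breaks things, using a fake entry-point script with a hard-coded shebang (the tool name and paths are made up for illustration; real installers generate slightly different script templates):

```python
import shutil
import tempfile
from pathlib import Path

# Fake a venv containing one console script whose shebang hard-codes
# the interpreter path, then "move" the venv as a user might.
old = Path(tempfile.mkdtemp()) / "venv"
(old / "bin").mkdir(parents=True)
script = old / "bin" / "mytool"  # hypothetical entry-point script
script.write_text(f"#!{old}/bin/python\nprint('hi')\n")

new = old.parent / "venv-moved"
shutil.move(str(old), str(new))

# The shebang still names the old interpreter location, which no
# longer exists, so running the moved script would fail.
shebang = (new / "bin" / "mytool").read_text().splitlines()[0]
assert not Path(shebang[2:]).exists()
```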

Actually, handling entry point scripts would be a problem for PEP 582 as well. The script is located within __pypackages__, so how would it know to add that location to sys.path?

You are talking about changing the default behaviour of pip in a backward-incompatible way. Ignoring the technical aspects (which aren’t as trivial as you suggest, but can be worked around if we had to) I have no idea how we’d even begin to assess the impact this would have on our user base. Particularly as some tools already support __pypackages__ so we can’t even assume that it’s unlikely anyone will have a directory of that name.

PDM solves this by searching for __pypackages__ in the parent directories. Specifically, PDM finds __pypackages__ in the following order.

  1. __pypackages__ in the same directory as the script being run (if any)
  2. __pypackages__ in the parent and ancestor directories, up to a configurable maximum depth
  3. __pypackages__ in the current working directory when not running a script (i.e. plain python or python -m)
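That lookup order could be sketched roughly like this (the default depth and exact semantics here are my assumptions for illustration, not PDM’s actual code):

```python
from __future__ import annotations

from pathlib import Path


def find_pypackages(script: Path | None, cwd: Path,
                    max_depth: int = 5) -> Path | None:
    # 1. Next to the script being run, if any.
    if script is not None:
        candidate = script.parent / "__pypackages__"
        if candidate.is_dir():
            return candidate
        # 2. Walk up through the ancestor directories, up to a
        #    configurable maximum depth.
        ancestor = script.parent
        for _ in range(max_depth):
            ancestor = ancestor.parent
            candidate = ancestor / "__pypackages__"
            if candidate.is_dir():
                return candidate
        return None
    # 3. No script (plain `python` or `python -m ...`): use the cwd.
    candidate = cwd / "__pypackages__"
    return candidate if candidate.is_dir() else None
```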

1 Like

That is why in this PEP we said that we only support running modules, not entry point scripts, so all the projects that allow python3 -m modulename will just work.

2 Likes

Right but this requires a tool, users can’t just type the command.

That’s poor UX. Also, defining entry points does not mean the project has a __main__.

1 Like

This PEP is not replacing virtual environments right away. When someone is ready to use more tools, they are also generally ready to use and understand virtual environments.

Users don’t know about virtual environments. They also don’t know where pip installs to. Combine the two and we get a lot of “why am I getting an ImportError when I installed using pip?” This is also ignoring the fact that people are installing globally without realizing it. We can obviously help guide users towards creating environments (and we probably will, especially if this PEP doesn’t go anywhere), but not having to do that step or extra layer of complexity would be nice.

Probably by defining our own API for other extensions to do the right thing to then plug into our UX (e.g. a Hatch extension that would get called when we hit a scenario that a user should be doing in a virtual environment). Then we make the difficulty of creating such an extension as dirt-cheap as possible (see our repo template for creating extensions for linters and formatters). Then we either work with the tool maintainers or seed the community with extensions supporting the most-used tools (e.g. our Pylint and Black extensions).

And we would directly support venv ourselves since it’s (practically) in every Python install so the baseline is always there (see the WWBD extension for an initial stab at a Create Environment command).

:smile: I hear you, and I’m not opposed to going that route, but I’m also not interested in differentiating VS Code that way specifically. If we can help e.g. vim users, I’m happy to work towards that first and foremost. Otherwise we can come up with ideas and I can share them here to garner feedback, but that does put it in a “box” within VS Code instead of wider sharing/use.

3 Likes

I understand that, but I’m still not clear how it’s a VS Code problem, or how VS Code will fix it (unless you write your own installer or wrap pip, which is what I thought you were suggesting). Because as long as users have to use pip to install, it’s the pip experience (for better or worse) that impacts them. Which is why I’d rather a solution that works with pip, than one that needs pip to change to accommodate it. There’s just too many users of pip to lightly consider changing its default behaviour IMO.

Because we get the “bug” reports that somehow auto-complete is broken or our terminal isn’t doing the right thing. :sweat_smile: It’s also a VS Code problem because I’m being asked to make the getting started experience for Python developers as good as I can.

There’s a couple of options that effectively revolve around forcing people into virtual environments. One is to try to get people into an environment early so that they get an activated terminal and thus make sure pip points at the one in the virtual environment (which is a start-up performance issue since creating environments on Windows is slow). The other is to provide a GUI around installation and package management (which is problematic as everyone has an opinion/command on how to do that).

So when looking at this PEP, the wins from a VS Code perspective are:

  1. No need to create an environment (which is slow on Windows, especially for people on old/low-end hardware)
  2. People wouldn’t accidentally use pip to install into the wrong place
  3. We know exactly where to look for the Python “environments” for the workspace (compared to now where we have to know what tool they used and then we have to ask the tool to tell us)

3 Likes

Not slow at all anymore, if you use virtualenv rather than the stdlib venv.