PEP 704 - Require virtual environments by default for package installers

I agree with @petersuter here. This does not solve any of the problems with virtual environments. People will still upgrade their OS and end up with a broken venv with broken symlinks with no easy way to fix them. People will rename or move the project directory and will end up with a broken venv. People will still forget to activate venvs, or activation will still fail, and end up with weird errors. None of those problems occur with PEP 582 (assuming installing the older Python version counts as easy).


I agree that the defaults should not require having to learn about virtual envs, or making choices about them. IMO it should just be there (in a standard location; I wouldn’t suggest within a given project folder), unless you want to start renaming your envs etc., but then you can look up how to do that.

Re some of @dstufft’s points:

  • I think docker is special in the sense that image recipes there are already full of special gymnastics to keep the footprint as small as possible, so adding one flag to the python install would not be a big burden IMO (and it would work without that flag)
  • A putative PEP 668 + PEP 704 combination should work together in the sense that it should be possible to tell the installer to accept & use the conda env (and conda could probably hook into this by default)
  • Tools that ship their own Python install – this is an advanced case, and such authors can just opt out of venvs when they package their own Python
  • I think it would be fine to not explicitly support any PATH manipulation things. Those are hacks IMO that should be replaced by better tools/options.

So I think the idea needs to connect a bit more to existing use cases, and polish the UX a bit (fewer errors, more happy defaults), but I still think it’s the best direction so far.

What does “run an installer” mean, exactly?
With require-virtualenv = true, pip currently allows pip list or pip hash, but not pip freeze, pip download or pip install --target.

Shouldn’t this be limited to systems that actually have a system python? There’s no sense in hassling people who are in complete control of their own install.

How about keeping PIP_REQUIRE_VENV so that the opposite preference can be set with PIP_REQUIRE_VENV=0?
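Concretely, the two knobs could look like this (a sketch: `PIP_REQUIRE_VIRTUALENV` is pip’s existing environment-variable spelling of `--require-virtualenv`; treating `PIP_REQUIRE_VENV=0` as an explicit opt-out is the proposal here, not current pip behaviour):

```shell
# Today: opt *in* to requiring a virtual environment
export PIP_REQUIRE_VIRTUALENV=true

# Proposed: once requiring a venv is the default, keep the same
# variable around so the opposite preference stays expressible
export PIP_REQUIRE_VENV=0
```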


The requirement to provide instructions on how to create a virtual environment leaves the question open as to what tool to use. Should we require recommending venv? If a tool recommends virtualenv, how does the user get that? For users using an IDE, why not ask the IDE to make an environment? And as noted elsewhere, what about conda environments?

There is a whole bunch of complexity around creating an environment - do we really want to make it the installer’s responsibility to help new users navigate that?


This would be a great improvement!

As @CAM-Gerlach said, in maturin we already have this behaviour, and from the replies I haven’t seen a case that goes beyond complaining that it’s inconvenient.

@pradyunsg the rendered link in the first post is outdated; I think it should now be peps/pep-0704.rst at require-venv · pradyunsg/peps · GitHub

One important thing missing is how to detect an activated venv from a cli tool, and subsequently how this mechanism works with both normal venvs and conda. Currently in maturin, we look for VIRTUAL_ENV or CONDA_PREFIX, but it would be nicer to have a common key. We could e.g. have PYTHON_ENV_ROOT, which activate and conda would set in addition to VIRTUAL_ENV and CONDA_PREFIX. This would also cover a lot of the edge-case scenarios: setting the correct PYTHON_ENV_ROOT would be enough to declare that this is an isolated (or virtual) environment.
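A minimal sketch of what such detection looks like today, before any common key exists (the function name is hypothetical; `PYTHON_ENV_ROOT` is the proposed key from above, and the `sys.prefix` fallback catches a venv interpreter even when activation was skipped):

```python
import os
import sys

def detect_env_root():
    """Return the root of the active environment, or None.

    VIRTUAL_ENV is set by venv/virtualenv activation scripts,
    CONDA_PREFIX by `conda activate`. A standardized PYTHON_ENV_ROOT
    (hypothetical) would replace the per-tool probing with one key.
    """
    for var in ("PYTHON_ENV_ROOT", "VIRTUAL_ENV", "CONDA_PREFIX"):
        root = os.environ.get(var)
        if root:
            return root
    # A venv interpreter is detectable even without activation:
    # sys.prefix points into the venv, sys.base_prefix to the base install.
    if sys.prefix != sys.base_prefix:
        return sys.prefix
    return None
```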

In the PEP, what is considered a package installer? Obviously pip, poetry, etc. count but is maturin develop also included? From my perspective it makes sense to ban modifying non-isolated site packages, except for the case that the user opts in to that.

FWIW I think the long-term solution is not having environments at all, but that’s still a bit into the future :)

The installer SHOULD also provide a mechanism to disable this behaviour, allowing a user to opt-in to running the tool outside of a virtual environment. The installer MAY include this in the error message that it prints, and MUST include it in the documentation page it links to.

If the installer includes a link to a documentation page in the error message described above, the documentation page SHOULD be written with a focus on new users. It should explain what virtual environments are, why they are required, and how to create and activate a virtual environment. It SHOULD include instructions for the most common shells and platforms.

I’d change this to:

The installer MAY also provide a mechanism to disable this behaviour, allowing a user to opt-in to running the tool outside of a virtual environment. We advise tool authors to provide information on what virtual environments are, why they are required, and how to create and activate one, with a focus on new users, in error messages and documentation.

While I’m not sure if maturin counts as an installer, I think it’s reasonable for tools to not support non-venv use at all, and I don’t think error handling and docs should be part of the specification. I’d also add in the “How to Teach This” section that this goes into the 3.13 release notes and that this is a case where tools are meant to point the users to the right docs in case they aren’t using a venv.

In my experience you need a venv in docker again anyway for multistage builds. Additionally, python:3.13 can provide a container with an already activated .venv.

From both my own experience and from supporting other Python users, learning by having a completely broken/unreproducible global environment is much more painful. I believe the solution is to have tools that abstract the whole venv handling away from the user, as e.g. poetry successfully does with poetry install and poetry run.

Yes, and I think there should be standard text blocks and links that everybody just copies, essentially like:

venv = detect_venv()
if not venv:
    raise RuntimeError(
        "Can't install when not in a virtual environment. "
        "Please create and activate a virtualenv using "
        "`python -m venv .venv && source .venv/bin/activate`. "
        "Read more at ..."
    )
Open Question (and possibly an entry for the Rejected Ideas section): If a cli tool sees a .venv but it isn’t activated, should it just use that anyway (vs. saying “please run source .venv/bin/activate and try again”)?
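If “just use it anyway” were the answer, the discovery half might look like this sketch (hypothetical helper; `pyvenv.cfg` is the marker file the `venv` module writes into every environment it creates):

```python
from pathlib import Path

def find_local_venv(start=None):
    """Walk from `start` (default: cwd) upwards and return the first
    `.venv` directory containing a pyvenv.cfg marker, or None."""
    start = Path(start) if start is not None else Path.cwd()
    for directory in (start, *start.parents):
        candidate = directory / ".venv"
        if (candidate / "pyvenv.cfg").is_file():
            return candidate
    return None
```

Whether the installer should then use the result silently, or only print it in the error message, is exactly the open question.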

As a user of Conda environments I would find it slightly hostile if I needed to pass a specific flag every time I want to pip install in that context. Of course, perhaps pip can be taught to detect Conda environments, or a well-known tool-agnostic marker can be standardized to recognize environments (of whatever kind).


I’ve never once used a virtual environment in a docker build, even with complex multistage builds, so they’re hardly required.

It is true that python:3.13 can activate a virtual environment, but that doesn’t solve the larger problem: this PEP assumes that the only isolation mechanism worth caring about is virtualenv/venv. It just means that in one particular case there is a workaround (which is effectively “use venv”).

As an additional note, I’d note that unless great care is taken (or changes are made to venv/virtualenv), this also suffers from the “implicitly re-using environments across computers” problem that I mentioned in the PEP 582 thread.


I completely agree that other isolation mechanisms than just venv should be supported! I’d be really interested in talking about the specifics of python envs in docker (especially since it’s also a problem for maturin), but I don’t want to derail the thread.

That’s a lot harder than it seems. The text you suggested is Unix-specific and won’t work on Windows. And yes, our experience with pip is that users do just copy/paste what the message says, and get upset when it doesn’t work.

The history of the command we suggest people use to upgrade pip is an illuminating, but depressing, example of how hard it is to make suggestions of commands that users can “just type in” :slightly_frowning_face:
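This is exactly the Unix/Windows gap: even the simplest suggestion has to branch per platform, and it still only scratches the surface (a sketch; the function name is hypothetical, and real shells diverge further — PowerShell, cmd.exe, fish and csh each spell activation differently):

```python
import os

def activation_hint(venv_dir=".venv"):
    """Return a copy-pasteable activation command for the current OS.

    Only covers the two most common cases; PowerShell (Activate.ps1),
    fish and csh each need their own variant.
    """
    if os.name == "nt":
        # cmd.exe spelling of the venv activation script
        return rf"{venv_dir}\Scripts\activate"
    return f"source {venv_dir}/bin/activate"
```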


And whilst the PR is open, we can see a fully rendered preview at:


5 posts were split to a new topic: Should PEP 704 be a PEP?

Most of the systems I have come across that try to manage multiple virtual environments by name are managed from some global directory (otherwise it’s an ad-hoc situation or it’s a tool that hides virtual environments from you like Nox and tox).

That’s what the Python Launcher for Unix does, and it works out rather nicely in my opinion. Since this PEP doesn’t talk about execution and only installation, I don’t think it has any bearing on this specific case beyond helping standardize on virtual environment names.

I think that’s prescribing a specific motivation a bit too much. Pradyun may also be trying to force a decision to be made about virtual environments by providing an alternative to PEP 582, but not necessarily favouring one over the other.

I think it’s reasonable to make sure this PEP is phrased as a recommendation for installers that we think should be followed.

I personally see three levels of possibility here:

  1. What this PEP proposes (but potentially toned down to a recommendation), which is require an environment by default with some sort of opt-out (e.g. PIP_REQUIRE_VIRTUALENV=1 is the assumed default).
  2. If a virtual environment is …
    a. … found and is unambiguous, use it automatically (i.e. sort of like implicit environment activation).
    b. … found and it’s ambiguous which virtual environment to use, error out.
    c. … not found, continue doing what happens today and lean on PEP 668.
  3. Status quo

Option 2 would require standardizing on virtual environment locations which I’m selfishly an advocate for since, from a tools perspective, it’s just a mess to try and support the myriad of places people put their virtual environments. I have Support a way for other tools to assist in environment/interpreter discovery · Discussion #168 · brettcannon/python-launcher · GitHub for this, but for more common, simpler situations we could try to handle via standards instead of dynamically executed code.
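The branches of option 2 could be sketched as follows (hypothetical; `candidates` stands in for whatever discovery mechanism — standardized locations or tool-assisted lookup — eventually produces the list of environments):

```python
def choose_environment(candidates):
    """Sketch of option 2: pick an environment from discovery results.

    2a: exactly one candidate -> use it (implicit activation).
    2b: several candidates    -> refuse; the choice is ambiguous.
    2c: none                  -> fall back to today's PEP 668 behaviour.
    """
    if len(candidates) == 1:
        return candidates[0]
    if len(candidates) > 1:
        raise RuntimeError("multiple environments found; pick one explicitly")
    return None  # caller continues with PEP 668 handling
```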


Thanks for the lively discussion here folks, and for lots of feedback both here and on the PR! (I’d appreciate if we could keep the non-PEP-editor feedback here. :wink:)

I’ve gone ahead and relaxed the PEP’s language to recommendations. I’ve also updated the links in the OP to point to both the PEP source and the rendered version, and provided the eventual URL for it.[1] I’ve also clearly separated what I view as implementation details vs this PEP’s recommendations into separate sections.

I’ve gone ahead and improved the language in the PEP, to better reflect how it intends to resolve this aspect.

This is a direction I’d happily take this PEP, if we can’t reach consensus that we can unambiguously have isolated environments detected.

I’ve requested that moderators split the discussion on whether this is a valid “Packaging PEP” discussion to a separate thread. I do have additional thoughts and opinions, but I’d prefer to discuss those separately – if the conclusion of that thread ends up being that we shouldn’t use the PEP process for discussing ecosystem-affecting UX design decisions, then withdrawing this PEP while documenting the reasons in the PEP as a top-of-PEP withdrawal notice is 100% OK with me.

  1. Perhaps posting it for discussion prior to letting it go through PEP editors’ copy edits wasn’t the best move in hindsight. :slight_smile: ↩︎

The discussion has been focused mainly on pip, but the PEP talks about “installers” in general. Does your view change if I ask you whether you would be happy implementing this functionality in python -m installer <path_to_wheel>? That’s not a troll, it’s a genuine question - we really do have multiple installers these days, and it’s a good sanity check on whether a proposal is saying “pip should do X” or “installers should do X”.

No, for the same reason that it won’t implement PEP 668’s suggested behaviours for “Python-specific package installer (that is, a tool such as pip - not an external tool such as apt)”.

The key difference is that it’s not meant to be a user-facing tool like pip, mousebender, etc are intended to be, and is primarily a shared implementation of unpacking wheels (the hope is that it’ll gain parity with pip and pip would switch to it) and the CLI is meant for “low level” use (i.e. setting up your Linux distro’s lowest layers before you have pip).

I share most of the concerns brought up above, particularly :

At the end of the day, if `import foo` resolves to something, then foo is “installed”, independent of how the import mechanism discovered it, how those files (if any!) ended up on disk, or whether there are any metadata files floating around. While pip and venv are obviously very privileged implementations of how to install packages and isolate things given their relation to core Python, they are by far not the only, and in many cases not the best, solutions to these problems. Please be careful that these privileged tools do not become overly indexed for the kinds of use cases that their maintainers happen to have.

Related, this proposal seems to take as given that in-tree virtual envs are the best (or at least sufficiently consensus best) option and should be suggested as the “standard”.

This pattern prevents having multiple envs with different versions of Python/dependencies/both for the same project because it picks a privileged name / location. Further, it makes it very awkward (particularly coupled with auto-activate / discovery based on cwd) to work on multiple projects that interact with each other (e.g. multiple libraries that depend on each other or a library and an application that uses it).

I do not think these are “fringe” or “advanced” use cases. In the sciences a very common pattern, and one that is being actively encouraged, is to put the reusable parts of the analysis into a library and have a separate “application” which uses it for the (embargoed) science analysis. In my now ~15 years of using Python, I am not sure I was ever in a situation where in-tree was a suitable, let alone ideal, solution.

That in-tree venvs are encouraged by some tools has also led to pain in downstream projects, when users have checked the whole venv in and re-writing history was required (see Minor re-writing of history and moving from master to main for default branch - Development - Matplotlib); any search for a function from Matplotlib on GitHub to estimate actual usage is completely spoiled by the many (many) people who have committed the whole venv to their hello-world project.

A better solution is for the pip that conda ships to simply patch out this behavior (or at least change the default)


My concern with this is that people would update pip with pip, in a Conda environment, following the instructions that pip prints.


Can you clarify this bit? If the shared code is in a library, then aren’t you installing it for your application? And if so, can’t you install it for each application? Is the convenience of not installing it per application what you’re suggesting by having it all in a single environment?

In my 20 years I have constantly found it suitable, and from a tooling perspective, ideal. :wink: But I’m going to start up a separate topic to see if we can’t find some solution that works for more use cases (including yours).