I agree with @petersuter here. This does not solve any of the problems with virtual environments. People will still upgrade their OS and end up with a broken venv with broken symlinks with no easy way to fix them. People will rename or move the project directory and will end up with a broken venv. People will still forget to activate venvs, or activation will still fail, and end up with weird errors. None of those problems occur with PEP 582 (assuming installing the older Python version counts as easy).
I agree that the defaults should not require having to learn about virtual envs, or making choices about them. IMO it should just be there (in a standard location; I wouldn't suggest within a given project folder), unless you want to start renaming your envs etc., but then you can look up how to do that.
Re some of @dstufft's points:
- I think docker is special in the sense that image recipes there are already full of special gymnastics to keep the footprint as small as possible, so adding one flag to the python install would not be a big burden IMO (and it would work without that flag)
- A putative PEP 668 + PEP 704 should work together in the sense that it should be possible to tell the installer to accept & use the conda env (and conda could probably hook into this by default)
- Tools that ship their own Python install: this is an advanced case, and such authors can just opt out of venvs when they package their own Python
- I think it would be fine to not explicitly support any `PATH` manipulation things. Those are hacks IMO that should be replaced by better tools/options.
So I think the idea needs to connect a bit more to existing use cases, and polish the UX a bit (fewer errors, more happy defaults), but I still think it's the best direction so far.
What does "run an installer" mean, exactly?
With `require-virtualenv = true`, pip currently allows `pip list` or `pip hash`, but not `pip freeze`, `pip download` or `pip install --target`.
Shouldn't this be limited to systems that actually have a system Python? There's no sense in hassling people who are in complete control of their own install.
How about keeping `PIP_REQUIRE_VENV`, so that the opposite preference can be set with `PIP_REQUIRE_VENV=0`?
The requirement to provide instructions on how to create a virtual environment leaves the question open as to what tool to use. Should we require recommending `venv`? If a tool recommends virtualenv, how does the user get that? For users using an IDE, why not ask the IDE to make an environment? And as noted elsewhere, what about conda environments?
There is a whole bunch of complexity around creating an environment - do we really want to make it the installer's responsibility to help new users navigate that?
This would be a great improvement!
As @CAM-Gerlach said, in maturin we already have this behaviour, and from the replies I haven't seen a case that goes beyond complaining that it's inconvenient.
@pradyunsg the rendered link in the first post is outdated; I think it should now be https://github.com/pradyunsg/peps/blob/require-venv/pep-0704.rst
One important thing missing is how to detect an activated venv from a CLI tool, and subsequently how this mechanism works with both normal venvs and conda. Currently in maturin, we look for `VIRTUAL_ENV` or `CONDA_PREFIX`, but it would be nicer to have a common key. We could e.g. have `PYTHON_ENV_ROOT`, which `activate` and `conda` would set in addition to `VIRTUAL_ENV` and `CONDA_PREFIX`. This would also cover a lot of the edge-case scenarios, which could be handled by just setting the correct `PYTHON_ENV_ROOT` to declare that this is an isolated (or virtual) environment.
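To make the suggestion concrete, a detection helper could fall back through the three variables roughly like this (a minimal sketch; `PYTHON_ENV_ROOT` is the hypothetical common key proposed here, not something `activate` or `conda` set today, and `detect_env_root` is an illustrative name, not an existing API):

```python
import os
from pathlib import Path
from typing import Optional


def detect_env_root() -> Optional[Path]:
    """Return the root of the active isolated environment, if any.

    PYTHON_ENV_ROOT is the hypothetical common key suggested above;
    VIRTUAL_ENV and CONDA_PREFIX are what `activate` and `conda`
    actually set today.
    """
    for key in ("PYTHON_ENV_ROOT", "VIRTUAL_ENV", "CONDA_PREFIX"):
        value = os.environ.get(key)
        if value:
            return Path(value)
    return None
```

A tool would then treat a `None` result as "no isolated environment active" and error out (or fall back), without caring whether the environment came from venv or conda.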
In the PEP, what is considered a package installer? Obviously pip, poetry, etc. count, but is `maturin develop` also included? From my perspective it makes sense to ban modifying non-isolated site-packages, except for the case that the user opts in to that.
FWIW I think the long-term solution is not having environments at all, but that's still a bit into the future :)
The installer SHOULD also provide a mechanism to disable this behaviour, allowing a user to opt-in to running the tool outside of a virtual environment. The installer MAY include this in the error message that it prints, and MUST include it in the documentation page it links to.
If the installer includes a link to a documentation page in the error message described above, the documentation page SHOULD be written with a focus on new users. It should explain what virtual environments are, why they are required, and how to create and activate a virtual environment. It SHOULD include instructions for the most common shells and platforms.
I'd change this to:
The installer MAY also provide a mechanism to disable this behaviour, allowing a user to opt in to running the tool outside of a virtual environment. We advise tool authors to provide information on what virtual environments are, why they are required, and how to create and activate a virtual environment, with a focus on new users, in error messages and documentation.
While I'm not sure if maturin counts as an installer, I think it's reasonable for tools to not support non-venv use at all, and I don't think error handling and docs should be part of the specification. I'd also add in the "How to Teach This" section that this goes into the 3.13 release notes, and that this is a case where tools are meant to point users to the right docs in case they aren't using a venv.
In my experience you need a venv in docker again anyways for multistage builds. Additionally, `python:3.13` can provide a container with an already activated `.venv`.
From both my own experience and from supporting other Python users, learning by having a completely broken/unreproducible global environment is much more painful. I believe the solution is to have tools that abstract the whole venv handling away from the user, as e.g. poetry successfully does with `poetry install` and `poetry run`.
Yes, and I think there should be standard text blocks and a link that everybody just copies, essentially like:
```python
venv = detect_venv()
if not venv:
    raise RuntimeError(
        "Can't install when not in a virtual environment. "
        "Please create and activate a virtualenv using "
        "`python -m venv .venv && source .venv/bin/activate`. "
        "Read more at https://packaging.python.org/en/latest/tutorials/installing-packages/#creating-virtual-environments"
    )
```
Open question (and possibly an entry for the Rejected Ideas section): if a CLI tool sees a `.venv` but it isn't activated, should it just use that anyway (vs. saying "please run `source .venv/bin/activate` and try again")?
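The "use it anyway" branch is cheap to sketch, since venv layouts are predictable. A minimal illustration, assuming the standard POSIX and Windows layouts (`find_dormant_venv` is an illustrative name, not an existing API):

```python
import sys
from pathlib import Path
from typing import Optional


def find_dormant_venv(project_dir: Path) -> Optional[Path]:
    """Return the interpreter of an existing-but-unactivated `.venv`, if any.

    A tool could run this interpreter directly instead of erroring out;
    the subpath mirrors where `venv` puts the interpreter on each OS.
    """
    venv = project_dir / ".venv"
    if not venv.is_dir():
        return None
    subpath = "Scripts/python.exe" if sys.platform == "win32" else "bin/python"
    python = venv / subpath
    return python if python.exists() else None
```

The real design question is not the lookup but whether implicit use surprises users less than an error message does.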
As a user of Conda environments I would find it slightly hostile if I needed to pass a specific flag every time I want to `pip install` in that context. Of course, perhaps `pip` can be taught to detect Conda environments, or a well-known tool-agnostic marker can be standardized to recognize environments (of whatever kind).
I've never once used a virtual environment in a docker build, even with complex multistage builds, so they're hardly required.
It is true that `python:3.13` can activate a virtual environment, but that doesn't solve the larger problem that this PEP assumes the only isolation mechanism worth caring about is `virtualenv`/`venv`; it just means that in one particular case there is a workaround (which is effectively "use venv").
As an additional note, unless great care is taken (or changes are made to venv/virtualenv), this also suffers from the "implicitly re-using environments across computers" problem that I mentioned in the PEP 582 thread.
I completely agree that isolation mechanisms other than just venv should be supported! I'd be really interested in talking about the specifics of Python envs in docker (especially since it's also a problem for maturin), but I don't want to derail the thread.
That's a lot harder than it seems. The text you suggested is Unix-specific and won't work on Windows. And yes, our experience with pip is that users do just copy/paste what the message says, and get upset when it doesn't work.
The history of the command we suggest people use to upgrade pip is an illuminating, but depressing, example of how hard it is to suggest commands that users can "just type in".
And whilst the PR is open, we can see a fully rendered preview at:
5 posts were split to a new topic: Should PEP 704 be a PEP?
Most of the systems I have come across that try to manage multiple virtual environments by name are managed from some global directory (otherwise it's an ad-hoc situation, or it's a tool that hides virtual environments from you, like Nox and tox).
That's what the Python Launcher for Unix does, and it works out rather nicely in my opinion. Since this PEP doesn't talk about execution, only installation, I don't think it has any bearing on this specific case beyond helping standardize on virtual environment names.
I think that's prescribing a specific motivation a bit too much. Pradyun may also be trying to force some decision to be made about virtual environments by providing an alternative to PEP 582, but not necessarily favouring one over the other.
I think it's reasonable to make sure this PEP is phrased as a recommendation for installers that we think should be followed.
I personally see three levels of possibility here:
1. What this PEP proposes (but potentially toned down to a recommendation), which is to require an environment by default with some sort of opt-out (e.g. `PIP_REQUIRE_VIRTUALENV=1` is the assumed default).
2. If a virtual environment is …
   a. … found and is unambiguous, use it automatically (i.e. sort of like implicit environment activation).
   b. … found and it's ambiguous which virtual environment to use, error out.
   c. … not found, continue doing what happens today and lean on PEP 668.
3. Status quo.
Option 2 would require standardizing on virtual environment locations, which I'm selfishly an advocate for since, from a tooling perspective, it's just a mess to try and support the myriad of places people put their virtual environments. I have "Support a way for other tools to assist in environment/interpreter discovery" (brettcannon/python-launcher discussion #168 on GitHub) open for this, but for more common, simpler situations we could try to handle it via standards instead of dynamically executed code.
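The decision logic of option 2 is simple once discovery is standardized; the hard part is producing the candidate list in the first place. A sketch of just the decision step (`choose_environment` is a hypothetical name, not an existing API):

```python
from pathlib import Path
from typing import List, Optional


def choose_environment(candidates: List[Path]) -> Optional[Path]:
    """Pick the virtual environment to use, per option 2 above.

    - no candidates: return None and fall through to today's
      behaviour plus PEP 668
    - exactly one: use it implicitly (like implicit activation)
    - several: refuse to guess and ask the user to choose
    """
    if not candidates:
        return None
    if len(candidates) > 1:
        raise RuntimeError(
            "Multiple virtual environments found; "
            "please activate the one you want explicitly."
        )
    return candidates[0]
```

Everything interesting lives in how `candidates` gets built, which is exactly the discovery standardization question above.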
Thanks for the lively discussion here folks, and for lots of feedback both here and on the PR! (I'd appreciate it if we could keep the non-PEP-editor feedback here.)
I've gone ahead and relaxed the PEP's language to recommendations. I've also updated the links in the OP to point to both the PEP as written and as rendered, and provided the eventual URL for it.[1] I've also clearly separated what I view as implementation details vs. this PEP's recommendations into separate sections.
I've gone ahead and improved the language in the PEP to better reflect how it intends to resolve this aspect.
This is a direction I'd happily take this PEP, if we can't reach consensus that we can unambiguously have isolated environments detected.
I've requested that moderators split the discussion on whether this is a valid "Packaging PEP" into a separate thread. I do have additional thoughts and opinions, but I'd prefer to discuss those separately: if the conclusion of that thread ends up being that we shouldn't use the PEP process for discussing ecosystem-affecting UX design decisions, then withdrawing this PEP while documenting the reasons in a top-of-PEP withdrawal notice is 100% OK with me.
Perhaps posting it for discussion prior to letting it go through PEP editors' copy edits wasn't the best move in hindsight. ↩︎
The discussion has been focused mainly on pip, but the PEP talks about "installers" in general. Does your view change if I ask whether you would be happy implementing this functionality in `python -m installer <path_to_wheel>`? That's not a troll, it's a genuine question: we really do have multiple installers these days, and it's a good sanity check on whether a proposal is saying "pip should do X" or "installers should do X".
No, for the same reason that it won't implement PEP 668's suggested behaviours for a "Python-specific package installer (that is, a tool such as pip - not an external tool such as apt)".
The key difference is that it's not meant to be a user-facing tool like pip, mousebender, etc. are intended to be; it is primarily a shared implementation of unpacking wheels (the hope is that it'll gain parity with pip and pip will switch to it), and its CLI is meant for "low level" use (i.e. setting up your Linux distro's lowest layers before you have `pip`).
I share most of the concerns brought up above, particularly:
At the end of the day, if `import foo.bar` resolves to something, then `foo` is "installed", independent of how the import mechanism discovered it, how those files (if any!) ended up on disk, or whether there are any metadata files floating around. While `pip` and `venv` are obviously very privileged implementations of how to install packages and isolate things, given their relation to core Python, they are by far not the only, and in many cases not the best, solutions to these problems. Please be careful that these privileged tools do not become overly indexed for the kinds of use cases that their maintainers happen to have.
Related, this proposal seems to take as given that in-tree virtual envs are the best (or at least the sufficiently-agreed-upon best) option and should be suggested as the "standard".
This pattern prevents having multiple envs with different versions of Python/dependencies/both for the same project because it picks a privileged name / location. Further, it makes it very awkward (particularly coupled with auto-activate / discovery based on cwd) to work on multiple projects that interact with each other (e.g. multiple libraries that depend on each other or a library and an application that uses it).
I do not think these are "fringe" or "advanced" use cases. In the sciences a very common pattern, and one that is being actively encouraged, is to put the reusable parts of the analysis into a library and have a separate "application" which uses it for the (embargoed) science analysis. In my now ~15 years of using Python, I am not sure I was ever in a situation where in-tree was a suitable, let alone ideal, solution.
That in-tree venvs are encouraged by some tools has also led to pain in downstream projects when users have checked the whole venv in, requiring re-writing history (see "Minor re-writing of history and moving from master to main for default branch" on the Matplotlib Development forum), and any search for a function from Matplotlib on GitHub to estimate actual usage is completely spoiled by the many (many) people who have committed the whole venv to their hello-world project.
A better solution is for the `pip` that conda ships to simply patch out this behavior (or at least change the default).
My concern with this is that people would update pip with pip, in a Conda environment, following the instructions that pip prints.
Can you clarify this bit? If the shared code is in a library, then aren't you installing it for your application? And if so, can't you install it for each application? Is the convenience of not installing it per application what you are suggesting by having it all in a single environment?
In my 20 years I have constantly found it suitable, and from a tooling perspective, ideal. But I'm going to start up a separate topic to see if we can't find some solution that works for more use cases (including yours).