Correct me if I’m wrong, but isn’t installing Python through the Windows Store the preferred method now? That version bundles its own copy of pip as well.
Anyway, if venvs set themselves up and activated themselves automatically, I would finally use them. I rarely touch them right now because of the extra steps involved when changing projects or swapping servers, so the easier the setup and the fewer the steps, the better IMO.
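For context, the manual per-project dance those "extra steps" refer to looks roughly like this (a sketch; paths and the activate script location vary by platform and shell):

```shell
# Create an environment alongside the project (once per project).
python3 -m venv .venv

# Activate it for this shell session (POSIX; on Windows it is
# .venv\Scripts\activate instead).
. .venv/bin/activate

# Install the project's dependencies into the isolated environment.
pip install -r requirements.txt

# ...work...

# Drop back to the base interpreter before switching projects.
deactivate
```

Repeat the activate/deactivate pair every time you change projects, which is exactly the friction being complained about.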
Those conveniences would also be a shortcut to accomplishing what PEP 582 proposes, and I’m a fan of that PEP for the same reasons. I take no pleasure in having to work in the JS ecosystem, but I do think the default behaviour of its package manager is a strong upside.
IIUC it’s not the “preferred” method, simply one method among others. Either way, the pip bundled in the Store distribution is exactly the same as what you get from the Windows installer, so the situation is identical: users of the Store distribution still need pip to manage their Store-installed Python, and PIP_REQUIRE_VIRTUALENV breaks that workflow.
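For anyone unfamiliar, the guard that PIP_REQUIRE_VIRTUALENV enables boils down to a check like the following (a simplified sketch, not pip’s actual implementation; the function names are mine):

```python
import os
import sys


def in_virtualenv() -> bool:
    """Rough check for a PEP 405 venv or a legacy virtualenv.

    Inside a venv, sys.prefix points at the environment while
    sys.base_prefix still points at the base interpreter; outside
    one, they are equal. Old-style virtualenv set sys.real_prefix.
    """
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix) or hasattr(
        sys, "real_prefix"
    )


def require_virtualenv() -> bool:
    """Read the PIP_REQUIRE_VIRTUALENV environment variable."""
    return os.environ.get("PIP_REQUIRE_VIRTUALENV", "").lower() in ("1", "true", "yes")


if __name__ == "__main__":
    if require_virtualenv() and not in_virtualenv():
        sys.exit("Refusing to install: not inside a virtualenv")
```

Which is exactly why it trips up the Store-Python workflow: managing the base interpreter’s own packages is, by definition, done outside any environment.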
Others in this thread have alluded to the idea that there exist a number of legitimate use cases where requiring a virtualenv is some combination of:
Not needed
Not important or valuable enough to the business to spend time worrying about
Already maintained automagically by pre-existing tools
Abstracted away by Docker/Conda/Infrastructure As Code tools
Breaking a significant amount of backwards compatibility
Outside of those practical bullet points, I think there is additional (admittedly opinion-based, but still worthwhile) discussion to be had about how “Pythonic” it is to opt users in to new functionality, and whether that new functionality imposes unnecessary rigidity or restriction.
IMBO this proposal, while definitely well-intentioned, feels like many of the discussions surrounding topics like static type checking, and whether or not “one size should fit all.” Flexibility, even to the point of letting you break things, is a founding tenet of the language, no?
I looked into environment detection recently, and I couldn’t find any way to reliably detect environments made by different tools (virtualenv, conda, pyenv). I believe all these kinds of environments should be treated similarly from a user perspective, so I’m generally against adding any further behaviour that relies on testing ‘am I inside an environment?’
This is why the new automatic switch to --user installation looks at permissions, not environments.
It’s tempting to propose some standard ‘environment marker’ for all of these systems, but I think that’s answering the wrong question. What we care about is having an isolated set of packages that won’t interfere with anything else - but a non-environment Python (e.g. in a container) can be sufficiently isolated, while you can have one environment which is used for many things (e.g. a centrally managed conda environment on an HPC cluster). It’s not easy to automatically work out ‘will installing a package here interfere with other things?’
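To make the detection problem concrete, here is a sketch of the per-tool heuristics one ends up stacking (the function is hypothetical; each check is a tool-specific convention rather than any shared standard, and none of them answers the actual isolation question):

```python
import os
import sys


def guess_environment() -> str:
    """Best-effort guess at what kind of environment we are running in.

    Ordered by specificity; every branch keys off a different
    tool-specific convention rather than a common marker.
    """
    if os.environ.get("CONDA_PREFIX"):
        # Set by `conda activate`; absent if the env's python is run directly.
        return "conda"
    if os.environ.get("VIRTUAL_ENV"):
        # Set by venv/virtualenv activate scripts -- again, only if activated.
        return "venv (activated)"
    if sys.prefix != getattr(sys, "base_prefix", sys.prefix):
        # A PEP 405 venv invoked by direct path, with no activation at all.
        return "venv (unactivated)"
    if os.path.exists("/.dockerenv"):
        # Not an "environment" in the venv sense, yet perfectly isolated.
        return "container (no env)"
    return "unknown"
```

And even when one of these branches fires, it still doesn’t tell you whether installing a package here will interfere with anything else, which is the real question.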
For the Python extension of VS Code we have custom logic for every tool we support.
I was thinking along the same lines this morning and came to the same conclusion. Really, the best we can have is each tool clearly defining and documenting its own detection and activation requirements.