If there is to be some change to py or a new command-line entry point for running Python then that would be a good opportunity to fix some of the other beginner pain points. Specifically at the point when it actually is time to use the terminal:
What exactly is the command-line invocation for Python or for pip that works correctly after a standard install on all operating systems (python, python3, py, pip, pip3, py -m pip, …)?
Can there be a correct command-line invocation that is automatically configured to be on PATH, so that users don’t need to edit PATH manually when using default install settings?
The various proposals discussed present opportunities to make this situation better as well as the risk of making it worse by introducing yet more incompatible different ways to run Python.
The invocation or configuration of Python on different operating systems is not part of this PEP, or shouldn’t be (I’m not saying those are good or bad suggestions).
This is only an optional pip feature that introduces a local installation of packages, with no impact on the actual behavior of Python or the ecosystem (other packages or distros).
That’s the fundamental risk I see here, too. I mentioned it some time ago.
At this point, I’m afraid we’re simply going round in circles. The two most likely ways we’ll break the logjam are:
1. The PEP gets updated to address the issues raised here, and we can have another round of discussion based on a revised proposal.
2. Someone implements a solution that doesn’t require co-operation from core Python or from other packaging tools, and hence doesn’t need a PEP.
But until one or the other of those happens, I don’t see much chance that anything will change.
Wrong. If it were only pip that needed to change, this wouldn’t need a PEP. It needs core Python to change to add __pypackages__ to sys.path. And it needs other tools (for example IDEs like VS Code) to recognise and work with the new scheme.
Thanks for your feedback! I don’t think this feature is meant to replace anything. It provides support to a common practice and also a simple and very practical solution for several use cases.
For instance, when you want to share a project or deploy it to a server/container where you don’t have permission to install anything globally, all without needing to run additional commands.
I think Brett also mentioned other very good examples. So I think there’s a real need for this.
You really need this to be something that works immediately after Python is installed, and that works the same way for all standard installs. Although it might sound like a third-party executable like pyx or something could solve this problem, it would only make things more confusing if it were not always installed, or if it ended up having different names on different OSes. Changing py could maybe work, but only if py is guaranteed to be on PATH and to exist on every OS.
I would say that without core Python buy-in it’s not worth doing anything. In fact, for this to really work there needs to be buy-in from Linux distros and Windows as well, to be clear that no platform should go its own way and try to do something different. I’m assuming here that macOS is okay, because there the Python installers can just add the executable and set up PATH, but it doesn’t work that way on Windows and Linux.
If you’re thinking that most Python beginners don’t use Linux then that’s probably true but a lot of the people who write the instructions for how to do anything with Python do use Linux. It needs to be clear what the recommended basic way of doing things is or otherwise conflicting advice will continue to proliferate.
There is an alternative: you could have the same tool manage both your python install + your python packages. So you download this other tool, and it’s still only one thing to install, but it has whatever convenient UX that you want, instead of making the long-standing python entry point do everything.
That sounds good, but fleshing the idea out a bit: how would this work across all possible platforms and installs?
I can see that potentially working on Windows, similar to the existing python.exe that ships with Windows, except that it would be provided to install as well as run Python from the command line. I imagine that having it download Python and Python packages directly might run into security concerns (presumably that’s why python.exe is limited to the MS app store), but maybe there’s a way around that.
If I’ve installed everything from Anaconda would I have the same entry point that this tool provides even if I’m not using it to install Python?
Would Homebrew provide the entry point?
Would the entry point exist in a basic install of a Linux distro?
Maybe some problems are not solvable. Needing to set some environment variable when installing Python, or to tick some option in the installer, is unfortunate but can be tolerated, especially since most beginners don’t immediately need the command line. The real problem right now is that even after things are installed and correctly configured, the steps from there for running Python, installing packages, activating environments, etc. are different for every setup, and there are so many ways to get it wrong by being confused about which Python environment is used in any particular context.
As much as I used to see people struggling to understand which Python or pip is being used when they type terminal commands, I now see exactly the same problem for selecting Python installs inside e.g. VS Code. The difference now is that I don’t understand how VS Code (or whatever IDE) works, or exactly how it decides which Python to use.
In my opinion, as a teacher, coach and developer, I am not convinced this would make it easier for students. To the contrary, it would be just one more thing to teach and learn. Venvs both have a strong footing in industry practice and are mirrored in other languages that use similar mechanisms, i.e. centralized library locations that are referenced by setting up a project’s runtime environment.
Also, I think moving the package location into the project source tree violates the conceptual separation of concerns, namely “my code” vs “third-party dependencies”. Venvs make this distinction explicit.
Venvs, in my experience, are misunderstood because they are rarely introduced as a prerequisite but rather as an afterthought. We should teach venvs as a key idea for creating separate namespaces*) for different projects. If you start like this, it is easier to understand that the first thing to do for a new project is to create a venv.
*) “Namespaces are one honking great idea. Let’s have more of those” (import this)
- users not knowing where the venv should live, because there are no docs/recommendations, so .venv is often chosen
- mature wrappers like tox went with e.g. .tox, and many tools copied that, like nox
- ~/.virtualenvs is commonly shared and known to IDEs, but without a wrapper it is more verbose for users to manage, e.g. virtualenv ~/.virtualenvs/<project_name>; . ~/.virtualenvs/<project_name>/bin/activate
This is getting OT, but it’s also ambiguous if you don’t name the environment after the directory it is meant for. It also leads to environment reuse, which is a good or bad thing depending on whether you do (or don’t) know what you’re doing.
No updates. Either consensus has to be reached (which it hasn’t) and/or someone needs to send this to the SC to make a decision (although it is a weird PEP in that it also impacts packaging, so it isn’t even clear who would get final say).
I am still asking around among people who teach, to try this out in their classrooms; sadly, none of them are active upstream. I am still dedicated to this and want to make sure this goes in. I am just a bit lost: everyone has a million different ideas (not implementations), and not many people actually support this one.
As a teacher of Python beginners, I disagree – this is one more level of complication that really isn’t needed to get started. (I did teach an intro class with this philosophy a few years back, and it did not go well.) My experience is that beginners are confused enough trying to keep track of how their IDE runs Python, which version gets run, etc.
And while some kind of isolated environment system (I use conda…) is critical for a lot of production work, it is not at all necessary to have a new environment for every “project”, if a project is a simple script, etc.
In fact, some tutorials, e.g. for web frameworks, start out with creating a venv – and I’ve seen students get very confused by that, ending up making a new one every single time they start work on the project. Yes, if they had been taught about virtualenvs from day one, they probably wouldn’t have had that confusion, but instead they would have had a much harder time writing their first little script.
Finally, another good reason to keep it out of the intro is that there isn’t one venv system – I think it’s very helpful to make it clear what is Python, what is a particular IDE, what is particular environment system, etc.
I am still excited about PEP 582’s __pypackages__. Venvs are an ugly hack, but we have grown accustomed to them over time. We benefit from a system that reifies the application written in Python, in contrast to a Python interpreter that happens to have a set of packages installed inside it.
I’m gonna have to disagree with your disagreement.
PHP, Node.js, Ruby, etc. all implement this. No one has ever been confused about how it works, so I’m not sure why your students would be.
The advantages are very obvious: you can see, in a folder inside the project you are working on, which dependencies it has and their source code. It is also great for IDEs, because they know where to look for the packages (no more guessing). And, most important of all, it does not mess with your system, so your system won’t break because you installed some packages while forgetting that the venv wasn’t active in that terminal.
The only downside is that it requires a minor change in the Python codebase.