I came across this PEP a few weeks ago and thought it was really interesting. I wrote a blog post about it and a proof of concept called pythonloc. Overall I like the idea, but I think some minor implementation changes could have a big impact on how useful this feature becomes and how widely it is adopted. I wanted to share some of my feedback here.
## Falling back to `site-packages` reduces usefulness of `__pypackages__`
If I am developing an app or library and I want to test it, I need to know exactly which packages are available. For example, if I blow away `__pypackages__`, then run `pip install .` and try to run my code, I cannot guarantee all my dependencies came from that `pip install .`; some could have been in the fallback location of `site-packages`. Because I am not 100% certain which libraries are being used, I basically can't use this for development. Almost everyone, including tools like Pipenv and Poetry, will still have to use virtual environments to guarantee deterministic package resolution.
I suggest searching only `__pypackages__` when that directory is present, with no fallback to `site-packages`. If that is done, then many more people can stop using virtual environments if they choose to.
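The rule I am proposing could be sketched like this (pure illustration; the directory layout and function names here are my assumptions, not anything the PEP specifies):

```python
import sys
from pathlib import Path

def package_search_path(project_dir, pypackages_exists):
    """Sketch of the proposed resolution rule: if __pypackages__ exists,
    it is the *only* source of packages (deterministic); otherwise the
    interpreter uses its normal site-packages lookup."""
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    if pypackages_exists:
        # No fallback: everything must come from the local directory.
        return [str(Path(project_dir) / "__pypackages__" / version / "lib")]
    # Placeholder for the interpreter's usual path entries.
    return ["<site-packages>"]
```

Under this rule, a missing dependency in `__pypackages__` fails immediately instead of being silently satisfied from a global install.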
## Namespacing of packages in `__pypackages__` does not include OS
One of the really nice things about this proposal is that the directory structure can be copied directly and run on a different machine. However, if code gets copied from a Windows machine to a Linux server and run, it may fail (with cryptic error messages) due to OS differences. I suggest namespacing on the OS as well as the Python version, such as a separate subdirectory per version/platform pair.
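To make the idea concrete, here is a sketch of what a version-and-platform namespaced layout could look like (the naming scheme is purely my assumption):

```python
import sys
from pathlib import Path

def namespaced_pypackages_dir(project_dir):
    # Hypothetical scheme: one subdirectory per Python version *and*
    # platform, e.g. __pypackages__/3.8-linux/lib. A tree copied from
    # Windows to Linux would then fail fast ("no packages for this
    # platform") instead of with a cryptic import error at runtime.
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    return Path(project_dir) / "__pypackages__" / f"{version}-{sys.platform}" / "lib"
```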
## Running binaries or “scripts”
When installing a package with entry points, its binaries get buried somewhere in `__pypackages__` with no obvious way to run them. npm solves this with `npx`, which executes either from a local `node_modules/.bin` or from a central cache, installing any packages needed in order for the command to run.
I updated pipx to work similarly (i.e. `pipx run flake8`; or, to search only `__pypackages__` and not a temporary installation, `pipx run --pypackages flake8`). pipx doesn't have to be the only solution: any simple program that can determine the expected local `bin/` directory can fill this role. For example, Poetry or Pipenv could add this functionality.
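The lookup such a tool needs is small. A sketch, assuming a `__pypackages__/<version>/bin` layout (the PEP does not pin down where entry points land, so this path is my assumption):

```python
import os
import sys
from pathlib import Path

def local_bin_dir(start=None):
    """Walk up from the current directory looking for a project-local
    bin directory, npx-style. Returns the first match, or None."""
    version = f"{sys.version_info.major}.{sys.version_info.minor}"
    current = Path(start or os.getcwd()).resolve()
    for directory in [current, *current.parents]:
        candidate = directory / "__pypackages__" / version / "bin"
        if candidate.is_dir():
            return candidate
    return None
```

A runner tool would then exec the entry point from that directory (or prepend the directory to `PATH`) instead of requiring the user to know where pip put it.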
The PEP is a little hand-wavy on how this will work, and mentions pip adapting to it:

> After doing a fresh check out of the source code, a tool like pip can be used to install the required dependencies directly into this directory.

> In another example scenario, a trainer of a Python class can say “Today we are going to learn how to use Twisted! To start, please checkout our example project, go to that directory, and then run `python3 -m pip install twisted`.”
In theory, this sounds great. pip will be modified to install to `__pypackages__` (by default?), and if a user wants to install to `site-packages` or their user directory, they can use the appropriate flags. However, this is a big change in behavior for pip. Are the pip maintainers on board with this? Adoption of the `__pypackages__` convention depends not only on Python running from that directory but also on how easy it is to manipulate the packages in it.
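For what it's worth, pip's existing `--target` flag already gets close to this behavior today; a sketch (the `3.8/lib` subdirectory mirrors the PEP's example layout, and `requests` is just an example package):

```shell
# Install a package into the project-local tree instead of site-packages.
# --target is an existing pip flag, no modification needed.
python3 -m pip install --target "__pypackages__/3.8/lib" requests

# Without interpreter support, the directory must be put on the path manually:
PYTHONPATH="__pypackages__/3.8/lib" python3 -c "import requests"
```

The open question is whether pip should do this by default when `__pypackages__` exists, which is the behavior change the maintainers would need to sign off on.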
Along the lines of the last section, being able to create a lockfile of everything in `__pypackages__` would be incredibly useful to the community. I'm not sure this is possible with current tooling, but this use case should definitely be considered.
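As a rough starting point, recent pip versions can snapshot an arbitrary directory via `pip freeze --path`; a sketch, assuming the PEP's `<version>/lib` layout:

```shell
# Ensure the directory exists for the demo (normally pip install creates it).
mkdir -p "__pypackages__/3.8/lib"

# Freeze only what is installed in the project-local tree,
# ignoring site-packages entirely.
python3 -m pip freeze --path "__pypackages__/3.8/lib" > requirements.lock
```

This pins versions but is not a true lockfile (no hashes or dependency graph), so dedicated tooling would still be worth building.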