For decades, the canonical way to install a Python package system-wide has been
python setup.py install
On most Linux systems this installs the package into the site-packages directory under /usr/local, just like any other typical non-package-manager install.
For most sysops the strategy is to install everything that is available as a native system package through yum/pacman/apt, and to install the handful of extra packages that are not available that way into /usr/local.
With the recent changes in setuptools, however, we now get a warning when calling setup.py install:
SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
If we follow the recommendation to use pip and run pip install . instead, we get another warning:
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
What should I recommend to my users to achieve a traditional /usr/local install without confusing them with lots of warnings?
Pip v22.1 added a command-line flag, --root-user-action=ignore, to pip install to silence the root-user warning; however, the recommendation for all users is to heed the warning and not install packages system-wide.
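If you nevertheless decide that a system-wide install is what you want, the flag is passed directly to pip install (a minimal sketch, assuming the install is run as root as in the question, with . being the project directory from pip install .):
sudo pip install --root-user-action=ignore .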
The reason is that installing system-wide has always risked conflicts with packages installed by the system’s package manager (eg apt, yum): Python is used by the system’s own internals, and user-installed packages can shadow system-installed ones, potentially preventing the system from working properly.
One suggestion I have for multi-user installs is to install packages under a sys-admin-controlled prefix (eg --prefix /srv/my-company) and then add that prefix’s site-packages to each user’s PYTHONPATH (eg PYTHONPATH=/srv/my-company/lib/python3.10/site-packages:$PYTHONPATH), as sketched below. For per-user installs, just use pip install --user or a virtual environment.
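In practice that looks something like the following (the prefix path and Python version are illustrative; adjust them to your site):
pip install --prefix /srv/my-company .
export PYTHONPATH=/srv/my-company/lib/python3.10/site-packages:$PYTHONPATH
The export line would typically live in a shared profile script (eg under /etc/profile.d/) so that every user picks it up automatically.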