Hi there :)!
I’ve asked about this topic before on IRC, but did not stay long enough to get an answer :).
A little bit of background: most of the Python projects I maintain are websites. To make upgrades on the hosts easy, I bundle them as deb packages for Debian. In my daily workflow I write some code, commit the changes, create a new git tag and push all of it to a GitLab repository, where the CI builds the deb packages and automatically publishes them to a private deb repository. This makes it relatively straightforward for me and others to release and deploy new versions of the software. A nice side effect is that I can declare dependencies on Debian packages that provide many of the common Python libraries.
Unfortunately, not all of the libraries I depend on are available in Debian, and sometimes the packaged versions are not recent enough for my needs. In these cases I fall back to dh-virtualenv, which builds virtualenvs as deb packages and installs dependencies via pip at build time. Every library that is not available in Debian, or that does not match the required version range, is packaged into one large deb that my primary project deb package depends on.
Some time ago I moved from a `requirements.txt` with dependencies to setuptools’ `install_requires` in a pretty standard `setup.py` file. My `requirements.txt` now only contains `.`. This basically works fine, but I noticed that my own project ends up in the virtualenv that dh-virtualenv generates. That was unexpected and not what I wanted, though I understand what’s happening: I inadvertently changed the semantics from “install these packages” to “install my package, including its dependencies”. I’ve found several suggestions to use the `-e` flag for `pip install`, but as far as I can see it is advertised as “development mode”, so I’m not sure it’s a good fit, as the virtualenv is used in production. It seems like a `--only-deps` option (to stick with the `--no-deps` naming scheme) would be a helpful addition to pip and would solve my use case; it has already been discussed in issue #4783 and the associated pull request #5950. Unfortunately, the original authors of those issues lost interest or lacked a specific use case.
- Is that a use case that would justify the inclusion of such an option?
- Are there any other good methods that I have overlooked to prevent the installation of my own package in the virtualenv?
I also noticed that executing `python3 setup.py` shows a `--requires` option that is supposed to print the list of required packages/modules, but it prints nothing and apparently does not do what I thought it would (maybe a bug?). I would have used it to generate a `requirements.txt` on the fly. I tested it with Python 3.8.2/3.7.3 and setuptools 46.1.3/40.8.0.
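In the meantime, one way to generate such a dependencies-only `requirements.txt` is to run `setup.py` with `setuptools.setup()` temporarily replaced by a stub that just records its keyword arguments. This is only a sketch of that workaround, not an official setuptools API; the function name and the default `setup.py` path are my own choices:

```python
# Recover install_requires from a setup.py without installing anything,
# by stubbing out setuptools.setup() and recording what it was called with.
import setuptools


def extract_install_requires(setup_py="setup.py"):
    """Execute setup.py with a stubbed setup() and return install_requires."""
    captured = {}
    real_setup = setuptools.setup

    def fake_setup(**kwargs):
        captured.update(kwargs)

    setuptools.setup = fake_setup
    try:
        with open(setup_py) as fh:
            code = compile(fh.read(), setup_py, "exec")
        # setup.py scripts expect to run as __main__
        exec(code, {"__name__": "__main__", "__file__": setup_py})
    finally:
        setuptools.setup = real_setup  # always restore the real function
    return captured.get("install_requires", [])
```

Writing the returned list to a file, one requirement per line, yields a `requirements.txt` that pip can install without pulling in the project itself. It only covers `setup.py` files that call `setuptools.setup()` directly, which should hold for a pretty standard setup like the one described above.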
Any feedback or remarks are very much appreciated.
tl;dr: When switching from dependencies in `requirements.txt` to dependencies in `setup.py`, I inadvertently started installing my own package inside a virtualenv that is shipped to production. How can I make pip install only the dependencies, but not my own package?