I’m trying to come up with a workflow for developing python packages. I’m working on a team with a shared collection of git repositories and a local pip repository. The typical flow is that we work on packages via the git repo, and then eventually the package is posted to the pip repo. This question concerns the time BEFORE the package is available on the pip repo.
So suppose I’m working on a package which has some dependencies. As I’m developing, I may be relying on in-development features of OTHER packages, so I will git clone those packages and install them into my virtual environment using `pip install -e .`. Often these packages use `setup.py` so that they can be pip installed.
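For concreteness, here’s a sketch of that setup (all package names and URLs are made up):

```bash
# Create and activate a fresh virtual environment
python -m venv .venv
source .venv/bin/activate

# Clone a dependency whose in-development features I need, and install it editable
git clone https://git.example.com/team/otherpkg.git
pip install -e ./otherpkg

# Install the package I'm actually working on, also editable
cd mypkg
pip install -e .
```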
The best way I’ve found to share my virtual environment with peer developers so that they can test my code is `pip freeze > requirements.txt`.
Now when I do this, the package that I’m working on is itself included in the requirements.txt file, which I keep in the root directory of my package. But since my package is installed into the virtual environment with the `-e` flag, it shows up as a direct URL dependency with the `-e` flag. That in itself is fine, but the editable entry records the exact state of the repo, so committing the new requirements.txt changes that state, which means requirements.txt should be regenerated and committed again…
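To illustrate, the freeze output looks something like this (names, URLs, and hashes are made up; the exact form of the editable lines depends on whether pip sees a VCS checkout or a plain local path):

```text
numpy==1.24.2
requests==2.31.0
-e git+https://git.example.com/team/otherpkg.git@a1b2c3d#egg=otherpkg
-e git+https://git.example.com/team/mypkg.git@e4f5a6b#egg=mypkg
```

The last line is the package itself, pinned to a particular commit.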
What is the proper way around this recursive problem I’ve gotten myself into? My current strategy is to EXCLUDE the package itself from requirements.txt (which requires me manually pruning it after requirements.txt is created) and then inform peer developers that they should `pip install -r requirements.txt` and then also `pip install -e .` on the package of interest.
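I can at least automate the pruning step. A sketch, assuming the package is named mypkg and appears on a single `-e` line in the freeze output (pip’s `--exclude-editable` flag exists but would also drop the other editable dependencies I want to keep):

```bash
# Regenerate requirements.txt without the package's own editable entry
pip freeze | grep -v 'egg=mypkg' > requirements.txt

# Peer developers then set up with:
pip install -r requirements.txt
pip install -e .
```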
Is there a better way?
Relatedly, but as a separate question: is it possible in the `install_requires` section of `setup.py` to specify that certain dependencies should be installed in editable mode, the same way you can in a requirements.txt file? I know direct URL dependencies are possible in setup.py, but that’s not what I’m looking for.
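For reference, this is the distinction I mean (made-up package name and URL):

```text
# requirements.txt accepts editable entries:
-e git+https://git.example.com/team/otherpkg.git#egg=otherpkg

# install_requires accepts PEP 508 direct URLs, but has no editable equivalent:
#   install_requires=["otherpkg @ git+https://git.example.com/team/otherpkg.git"]
```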