Should requirements.txt for a project package include that package itself?

I’m trying to come up with a workflow for developing Python packages. I’m working on a team with a shared collection of git repositories and a local pip repository. The typical flow is that we work on packages via the git repo, and eventually the package is posted to the pip repo. This question concerns the time BEFORE the package is available on the pip repo.

So suppose I’m working on a package which has some dependencies. As I’m developing, I may be relying on in-development features of OTHER packages, so I will git clone those packages and install them into my virtual environment using pip install -e .. Often these packages include a setup.py so that they can be pip installed.

The best way I’ve found to share my virtual environment with peer developers so that they can test my code is pip freeze > requirements.txt. When I do this, the package that I’m working on is included in the requirements.txt file, which I keep in the root directory of my package. Since my package is installed into the virtual environment with the -e flag, it shows up as a direct URL dependency with the -e flag. This is fine, but if I now commit the repo to include the new requirements.txt, the repo has changed again, so the frozen entry for my own package is stale and requirements.txt needs to be regenerated and committed…

What is the proper way around this recursive problem I’ve gotten myself into? My current strategy is to EXCLUDE the package itself from requirements.txt (which requires me to manually prune it after requirements.txt is generated) and then inform peer developers that they should pip install -r requirements.txt and then also pip install -e . on the package of interest.
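One way to avoid the manual pruning step (a sketch, assuming the only editable install in the environment is the package under development): pip can exclude editable installs from the freeze output directly, and older pip versions can be filtered with grep:

```shell
# Freeze everything except editable installs, so the package under
# development never lands in requirements.txt:
pip freeze --exclude-editable > requirements.txt

# Equivalent manual pruning for older pip versions: drop "-e" lines.
pip freeze | grep -v '^-e ' > requirements.txt
```

Collaborators then run pip install -r requirements.txt followed by pip install -e . exactly as described above, but requirements.txt no longer changes just because the package’s own commit moved.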

Is there a better way?

Relatedly, but as a separate question: is it possible in the install_requires section to specify that certain dependencies should be installed in editable mode, the same way you can in requirements.txt? I know direct URL dependencies are possible in install_requires, but that’s not what I’m looking for.

I will not claim to have fully understood what your workflow is, but I think that maybe one of these tools can help:

I believe that using only pip, its freeze command, and its requirements.txt file format has its limits indeed.

No. Maybe it is technically possible, I do not recall, but I would strongly argue against it. The list of dependency requirements in the package metadata (which install_requires feeds into) is meant for abstract dependencies; I believe adding the editable qualifier would make them “concrete dependencies”. Read more in the “install_requires vs requirements files” article.
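A minimal sketch of the abstract vs. concrete split (the package name mypkg, the dependencies, and the pinned versions are all made up for illustration): install_requires carries abstract requirements, while requirements.txt carries the concrete pins:

```shell
# Abstract dependencies in the package metadata: names and loose
# version ranges, no URLs, no exact pins.
cat > setup.py <<'EOF'
from setuptools import setup

setup(
    name="mypkg",
    version="0.1.0",
    install_requires=[
        "requests",        # any compatible version is acceptable
        "numpy>=1.20",     # only a lower bound, not a pin
    ],
)
EOF

# Concrete dependencies: exact pins describing one reproducible
# environment, as pip freeze would produce.
cat > requirements.txt <<'EOF'
requests==2.31.0
numpy==1.26.4
EOF
```

The abstract list states what the package needs to work anywhere; the concrete list reproduces one specific environment, which is why the editable qualifier only makes sense in the latter.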

Those packages seem interesting, but they also look like a lot of work to sift through to learn whether they will actually help with my problem. One direct question, though: are these tools that both me AND my collaborators would need to adopt? Or can my “sharing code” workflow be improved if I adopt one of these tools while my collaborators change nothing about their tooling?

As far as I know, these tools would need to be adopted by the whole team of people participating in the development of the project (the library, the Python package).

pip-tools is maybe the only one that would require little to no change for participants, and perhaps no change at all for people who are not directly involved in the development of the project but depend on it, because pip-tools works with requirements.txt files and does not try to take over any other part of the development workflow (it is not a build back-end, not a dependency manager, and not a virtual environment manager).
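For context, a hedged sketch of what the pip-tools flow looks like (the single requirement listed is just an example): abstract dependencies go into a hand-edited requirements.in, and pip-compile generates the pinned requirements.txt that collaborators consume with plain pip:

```shell
# Hand-maintained input: abstract, unpinned dependencies.
echo "requests" > requirements.in

# Then, with pip-tools installed (pip install pip-tools):
#   pip-compile requirements.in   # resolves and writes a fully pinned
#                                 # requirements.txt, transitive deps included
#   pip-sync requirements.txt     # makes the virtual environment match exactly
```

Collaborators who do not adopt pip-tools can keep running pip install -r requirements.txt against the compiled file, which is why adoption can stay local to the package’s maintainers.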