Ways to distribute package constraints amongst the team (org unit)

I’d like to brainstorm some ideas for distributing constraints to members of a team (org unit). Existing solutions seem unsatisfactory, leading to a preference for simpler but poorer practices.

In my team we develop many private Python libraries distributed as packages. We also work with other teams whose in-development packages we depend on. When actively developing, say while building a minimum viable product, we’d like all our members (developers and others) to have the same experience when they install our packages—that way they have a stable baseline to develop, test, and experiment with. There is no central application that users should install; some users may want just one of the many packages, several, or all of them. Our users typically use the packages as libraries, not applications.

In short, we’d like to avoid the “it works on my machine” problem when installing our packages.

Efforts here would involve constraining the versions of top-level and transitive dependencies that our users can install. For simplicity, let’s assume all users use, and are limited to, pip install. With this assumption, I see two categories of constraining:

  1. Without modifying pip install, the immediate solution is to hard-pin the dependencies in pyproject.toml. Many are familiar with how poor a practice this is, causing issues for both downstream users and package developers.
  2. Modifying pip install with flags, we have many options: requirement/lock files, constraint files, wheelhouses, etc.

When it comes to needing constraints for multiple packages/repositories that my team is responsible for, the most apt solution[1] is to distribute a constraint file accessible to all members, then migrate everyone to pip install -c 'scheme://somefileserver/path/to/constraint.txt'. This solution has a particular nuisance: users need read (and devs need write) access to wherever the constraint file is stored; if HTTP auth is required, then credentials must always be part of the command; if the file must exist locally, then it has to be downloaded or the fileserver mounted on the user’s machine by an external process.
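For concreteness, here is roughly what that looks like (the file contents, package names, and server URL are all made up for illustration):

```text
# constraints.txt — shared pins, readable by every team member
numpy==1.26.4
pandas==2.2.2
team-lib-a==1.4.0
```

Every member then installs with something like `pip install -c https://somefileserver.internal/constraints.txt team-lib-a`. Note that pip accepts a URL for `-c` just as it does for `-r`, which is where the auth/credentials nuisance above comes in.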

These complications led my team to opt, for practicality, for hard-pinning the dependencies in the pyproject.toml of an upstream package that all downstream packages depend on (i.e., the team’s common package). Team members avoid learning a new tool or setting up connections to additional servers. It’s simply pip install team-common-package (here be dragons).
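A sketch of what this ends up looking like (package names and versions are illustrative, not our actual metadata):

```toml
# pyproject.toml of the hypothetical team-common-package
[project]
name = "team-common-package"
version = "1.4.0"
dependencies = [
    # hard pins here act as de-facto constraints for everything downstream
    "numpy==1.26.4",
    "pandas==2.2.2",
]
```

Each downstream package then declares `team-common-package` in its own dependencies, inheriting the pins transitively—which is exactly the coupling that makes this poor practice outside our narrow use case.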


Related to Dependency constraints for setuptools?


  1. Why not a requirements/lock file per repository, like a dev-requirements.txt? This does not work when a user wants to install multiple packages, each with its own lock file that is unlikely to resolve successfully against the others. ↩︎
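To illustrate the footnote’s conflict (hypothetical repositories and pins):

```text
# repo-a/dev-requirements.txt
numpy==1.24.0

# repo-b/dev-requirements.txt
numpy==1.26.4
```

Installing both into one environment, e.g. `pip install -r repo-a/dev-requirements.txt -r repo-b/dev-requirements.txt`, fails with a resolution error because the pins on numpy contradict each other.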