Allow package references as version specifiers


I’d like to illustrate the problem using the grpcio and grpcio-tools packages: both are currently at version v1.49.0 and both live in the same GitHub repo, in the subfolders src/python/grpcio/ and tools/distrib/python/grpcio_tools/, respectively. Together with other packages and other languages, their versions bump in lock-step when released.

When I use both packages in a project, I can’t pin them, because other packages’ dependency declarations would conflict with that pin and pip would be unable to resolve the conflict. So I end up with dependency declarations like

dependencies = [
    "grpcio >=1.46.0,<2.0.0",
    "grpcio-tools >=1.46.0,<2.0.0",
]
And that’s where things get a little iffy: because of dependencies declared in other packages, the two packages may end up installed at different versions. If the packages use semantic versioning correctly then all may be well, as is the case with grpcio-tools and its dependency on grpcio (code); still, there is a good chance that the two packages install at different versions.*

Packages whose type stubs ship as a separate third-party package are another example of this problem.

I wonder if it would make sense to express a “package reference” as a version specifier (expanding on PEP 440), for example:

dependencies = [
    "grpcio >=1.46.0,<2.0.0",
    "grpcio-tools @=grpcio",
]
meaning that both packages share the same version range but are expected to resolve to the same installed version within that range. If the target package of a @= reference isn’t itself declared as a dependency, that would be an error; if the target package is pinned, the same pinned version would apply.
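To make the intended semantics concrete, here’s a minimal sketch (plain Python, names hypothetical) of the check a resolver would effectively add for a @= reference: after resolution, the referencing package must land on exactly the version of its target.

```python
# Hypothetical post-resolution check approximating the proposed "@=" semantics:
# the referencing package must resolve to exactly the target's version.
def release_tuple(version: str) -> tuple:
    """Naive release-segment parser; real tools would use packaging.version."""
    return tuple(int(part) for part in version.split("."))

def satisfies_package_reference(resolved: dict, package: str, target: str) -> bool:
    """True if `package` resolved to the same version as `target`."""
    return release_tuple(resolved[package]) == release_tuple(resolved[target])

resolved = {"grpcio": "1.49.0", "grpcio-tools": "1.49.0"}
satisfies_package_reference(resolved, "grpcio-tools", "grpcio")  # True
```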

Considering that @ is already used for direct file references, using @= may be confusing or ambiguous.

I’m curious what people make of this :nerd_face:

* Other packages, however, are completely out of lockstep, as is the case with googleapis-common-protos at v1.56.4 and its third-party, unmaintained stubs package at v2.0.0. Likewise the protobuf package at v4.21.6 (for Python) and its stubs in typeshed at v3.20. Ideally, I think, they ought to release at the same versions, but that’s a different issue altogether.

I’m not too keen on this. This changes the context for parsing and handling a dependency requirement from something that can be evaluated and processed independently to something that has external dependencies itself and needs a preprocessing step.

Is there some reason this couldn’t be achieved outside of custom syntax in the Packaging tooling, such as autogenerating this via cog or something similar?


I agree, this seems like something where you would have been better off designing things differently so that they worked with the existing mechanisms.

I don’t really see why the 2 libraries have to be in lock step like this. But if they do, then judging from the names it seems like if you use grpcio-tools, you need grpcio. So why not have grpcio-tools X.Y.Z depend on grpcio X.Y.Z? That expresses the dependency explicitly. Or if the two libraries really are completely independent (which seems weird given the tight version coupling) you could create a dummy package grpcio-base, and have grpcio X.Y.Z depend on grpcio-base X.Y.Z, and grpcio-tools X.Y.Z also depend on grpcio-base X.Y.Z.
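A toy sketch (Python, hypothetical names) of why the grpcio-base scheme would work: each release of grpcio and grpcio-tools pins grpcio-base to its own exact version, so a resolver can only co-install the two when their versions agree.

```python
# Toy model of the "dummy base package" idea: grpcio X.Y.Z and
# grpcio-tools X.Y.Z each declare "grpcio-base ==X.Y.Z", so both
# exact pins are satisfiable only when the two versions are identical.
def base_requirement(version: str) -> str:
    """The requirement string a package at `version` would declare."""
    return f"grpcio-base =={version}"

def co_installable(grpcio_version: str, tools_version: str) -> bool:
    # the two exact pins must name the same grpcio-base version
    return base_requirement(grpcio_version) == base_requirement(tools_version)

co_installable("1.49.0", "1.49.0")  # True
co_installable("1.49.0", "1.48.1")  # False
```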

I’ve probably missed something in the above. But my point stands, this sounds like something you could probably solve within the existing functionality, as long as you’re willing to adjust your package structure to make it happen.

I don’t really think we should add extra functionality just for a single, very specific use case like this.


A Hatch user just solved a similar use case with a custom metadata hook: Question: How to replace the relative path to version while building? · Issue #469 · pypa/hatch · GitHub



from hatchling.metadata.plugin.interface import MetadataHookInterface

class CustomMetadataHook(MetadataHookInterface):
    def update(self, metadata):
        metadata['dependencies'] = [...]

If you just want relative direct references, Hatch supports this use case too, see “Tip” in Dependencies - Hatch
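For the thread’s example, the body of such a hook could be as simple as generating exact pins from one shared version. A sketch (package names illustrative; how the version is obtained, e.g. read from a file, is left out):

```python
# Helper a custom metadata hook's update() could call to keep several
# dependencies pinned to one shared version.
def lockstep_dependencies(version: str, packages: list) -> list:
    """Pin every listed package to the same exact version."""
    return [f"{name} =={version}" for name in packages]

# Inside MetadataHookInterface.update(self, metadata), one would then set:
#   metadata["dependencies"] = lockstep_dependencies(version, ["grpcio", "grpcio-tools"])
lockstep_dependencies("1.49.0", ["grpcio", "grpcio-tools"])
# ["grpcio ==1.49.0", "grpcio-tools ==1.49.0"]
```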


The approach of handling this in a custom plugin for a package’s build system seems like a good one (and might be a better answer than the cog-style code generation I suggested above). I’d prefer that this be handled there, instead of in the dependency specifiers that indexes and installers like PyPI/pip need to handle.

They should really be declaring dependencies as dynamic if they are doing that :slight_smile:


I’d like to clarify and perhaps expand with a more general example:

We see many packages out there, popular ones included, that don’t contain typing annotations. For example, issue #795 of the Babel package discusses the problem and hints at a second package in Typeshed. Fortunately, these two are versioned in lock-step:

  • Babel is currently at release v2.10.2 and
  • Its type stubs are declared as v2.10.* which matches.

If I’d like to use Babel and its types then the above idea would enable me to specify

dependencies = [
    "babel >=2.8.0,<2.11.0",
    "types-babel @=babel",
]
and whichever tool resolves package dependencies would ensure that both package versions are the same within the specified range.

Of course, packages whose types are not versioned in lockstep with their package would fail here.

@ofek I’m curious to learn more about your suggestion—it would require me to switch from Flit to Hatch I assume, and then… ?

types-babel does not track babel exactly. For X.Y.Z, only X and Y are kept in sync with babel; the micro/patch version denotes the version of the type definitions for babel’s X.Y release. The latest version of babel is 2.10.3 and that of types-babel is 2.10.0. This is the case for the majority of third-party stubs maintained in typeshed, which (are assumed to) follow semver. (CalVer package stubs get a fourth segment with the stub version, e.g. with 0 being the first stub release.) Depending on how you envision types-babel @=babel to work, it might either be unresolvable or it might “artificially” resolve babel down to an older version.
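Under that scheme, a compatibility check would compare only the first two version segments. A sketch (naive string handling, my own illustration of the convention):

```python
# typeshed convention for semver libraries: in a stub version X.Y.Z,
# only X.Y tracks the library; Z counts stub releases for that X.Y.
def stubs_compatible(library_version: str, stub_version: str) -> bool:
    """True if the stub's X.Y matches the library's X.Y."""
    return library_version.split(".")[:2] == stub_version.split(".")[:2]

stubs_compatible("2.10.3", "2.10.0")  # True  (babel 2.10.3 / types-babel 2.10.0)
stubs_compatible("2.11.0", "2.10.0")  # False
```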

I would assume the type checkers will download the type annotations as needed based on the version of the library installed so you don’t have to list the explicit dependency.

I’m not aware of a type checker that does that. How would they know which version to download?

You can get it from the .dist-info directory for the installed library.
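For what it’s worth, the installed version is indeed easy to read from that metadata with the standard library; a sketch:

```python
# Reading an installed distribution's version from its .dist-info
# metadata, via the standard library (Python 3.8+).
from importlib.metadata import PackageNotFoundError, version

def installed_version(dist_name: str):
    """Return the installed version string, or None if not installed."""
    try:
        return version(dist_name)
    except PackageNotFoundError:
        return None

installed_version("babel")  # e.g. "2.10.3", or None if babel isn't installed
```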

How would they know which version of the published stubs is compatible with the installed library?

mypy has --install-types.

Well, like it says on the tin, that includes only a small number of stub packages, and I don’t think mypy does any version checking. But it’s good to know that the option exists :slight_smile:

Unfortunately, AFAIK that mostly, if not exclusively, uses the centrally maintained, automatically deployed typeshed-generated stub packages, plus maybe a few extras, and as it says in the linked doc,

For security reasons, these stubs are limited to only a small subset of manually selected packages that have been verified by the typeshed team.

It doesn’t work for most arbitrary stub packages in the general case, as I believe is being discussed here.

If types-babel declares a dependency on the matching versions of babel, then a downstream package can just depend on ["babel >=2.8.0,<2.11.0", "types-babel"], and pip should make sure that it picks a compatible version of types-babel.


2 posts were split to a new topic: Having build dependencies available as install dependencies

A post was merged into an existing topic: Having build dependencies available as install dependencies

Anyone know which docs would be the best ones to update to suggest this?

And in case anyone else who has been around forever was thrown by the types- prefix instead of the -stubs suffix: Type Stubs — typing documentation lists the prefix, while PEP 561 – Distributing and Packaging Type Information lists the suffix.