Allow package references as version specifiers

Hello,

I’d like to illustrate the problem using the grpcio and grpcio_tools packages: both are currently at v1.49.0 and both live in the same GitHub repo, in the subfolders src/python/grpcio/ and tools/distrib/python/grpcio_tools/, respectively. Together with the repo’s other packages and languages, their versions are bumped in lock-step on each release.

When I use both packages in a project, I can’t pin them: the pins would conflict with other packages’ dependency declarations, and pip would be unable to resolve the conflict. So I end up with dependency declarations like

dependencies = [
    "grpcio >=1.46.0,<2.0.0",
    "grpcio-tools >=1.46.0,<2.0.0",
]

And that’s where things get a little iffy: depending on what other packages declare as dependencies, the two packages may end up installed at different versions. If the packages use semantic versioning correctly, all may be well, as is the case with grpcio-tools and its declared dependency on grpcio (code); still, there is a good chance that the two packages install at different versions.*

Packages whose type stubs ship as a separate third-party package are another example of the same problem.

I wonder if it would make sense to express a “package reference” as a version specifier (expanding on PEP 440), for example:

dependencies = [
    "grpcio >=1.46.0,<2.0.0",
    "grpcio-tools @=grpcio",
]

meaning that both packages share the same version range and are ultimately expected to resolve to the same installed version within that range. If the target package of a @= reference isn’t itself declared, that would be an error; if the target package is pinned, the same pinned version would apply to the referencing package.
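To make the proposal a bit more concrete, a resolver would need a preprocessing pass that rewrites each @= reference into the target’s specifier before ordinary resolution. Here is a minimal sketch; the @= syntax and the expand_refs helper are hypothetical, not part of any existing tool:

```python
import re

REF = re.compile(r"^(\S+)\s*@=\s*(\S+)$")  # hypothetical "pkg @=target" syntax

def expand_refs(dependencies):
    """Rewrite '@=' package references into the target package's
    version specifier, yielding plain requirement strings."""
    # First pass: collect the specifiers of ordinary requirements.
    specs = {}
    for dep in dependencies:
        if not REF.match(dep):
            name, _, spec = dep.partition(" ")
            specs[name] = spec.strip()
    # Second pass: expand each reference, or fail if the target is absent.
    expanded = []
    for dep in dependencies:
        m = REF.match(dep)
        if m is None:
            expanded.append(dep)
        elif m.group(2) in specs:
            expanded.append(f"{m.group(1)} {specs[m.group(2)]}")
        else:
            raise ValueError(f"{m.group(1)}: no specifier for {m.group(2)!r}")
    return expanded

print(expand_refs(["grpcio >=1.46.0,<2.0.0", "grpcio-tools @=grpcio"]))
# -> ['grpcio >=1.46.0,<2.0.0', 'grpcio-tools >=1.46.0,<2.0.0']
```

Note that this textual rewrite only copies the range; the stronger part of the proposal, that both packages resolve to the identical version within the range, would additionally have to be enforced inside the resolver itself.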

Considering that @ is already used for direct file references, using @= may be confusing or ambiguous.

I’m curious what people make of this :nerd_face:
Jens

—————
* Other packages, however, are completely out of lock-step, as is the case with googleapis-common-protos at v1.56.4 and its unmaintained third-party stubs package at v2.0.0. Likewise the protobuf package at v4.21.6 (for Python) and its stubs in typeshed at v3.20. Ideally, I think, they ought to release at the same versions, but that’s a different issue altogether.

I’m not too keen on this. It changes a dependency requirement from something that can be parsed and evaluated independently into something that itself has external dependencies and needs a preprocessing step.

Is there some reason this couldn’t be achieved without custom syntax in the packaging tooling, e.g. by autogenerating the dependency list via cog or something similar?

1 Like

I agree, this seems like something where you would have been better off designing things differently so that they worked with the existing mechanisms.

I don’t really see why the two libraries have to be in lock-step like this. But if they do, then judging from the names it seems that if you use grpcio-tools, you need grpcio. So why not have grpcio-tools X.Y.Z depend on grpcio X.Y.Z? That expresses the dependency explicitly. Or, if the two libraries really are completely independent (which seems weird given the tight version coupling), you could create a dummy package grpcio-base, and have both grpcio X.Y.Z and grpcio-tools X.Y.Z depend on grpcio-base X.Y.Z.
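To sketch the dummy-package idea (grpcio-base is hypothetical and does not exist today; the version is just the current release), each release would carry an exact pin on the shared base:

```toml
# Hypothetical pyproject.toml fragment for grpcio-tools 1.49.0
[project]
name = "grpcio-tools"
version = "1.49.0"
dependencies = [
    "grpcio-base ==1.49.0",  # exact pin ties every release to the shared base
]
```

grpcio 1.49.0 would declare the same `grpcio-base ==1.49.0` pin, so a resolver that installs both packages must pick a single grpcio-base version, which forces matching versions of the two packages.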

I’ve probably missed something in the above. But my point stands: this sounds like something you could probably solve within the existing functionality, as long as you’re willing to adjust your package structure to make it happen.

I don’t really think we should add extra functionality just for a single, very specific use case like this.

4 Likes

A Hatch user just solved a similar use case with a custom metadata hook: Question: How to replace the relative path to version while building? · Issue #469 · pypa/hatch · GitHub

pyproject.toml:

[tool.hatch.metadata.hooks.custom]

hatch_build.py:

from hatchling.metadata.plugin.interface import MetadataHookInterface

class CustomMetadataHook(MetadataHookInterface):
    def update(self, metadata):
        # Compute the dependency list at build time.
        metadata['dependencies'] = [...]

If you just want relative direct references, Hatch supports this use case too, see “Tip” in Dependencies - Hatch
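For the lock-step case earlier in the thread, such a hook could derive both specifiers from a single range so they can never drift apart. A sketch of what the hook’s update() might compute; the range and the helper name are illustrative, not part of Hatch’s API:

```python
# Single source of truth for both packages (illustrative range).
GRPC_RANGE = ">=1.46.0,<2.0.0"

def locked_dependencies(range_spec=GRPC_RANGE):
    """Build both requirement strings from one shared version range."""
    return [f"grpcio {range_spec}", f"grpcio-tools {range_spec}"]

# Inside CustomMetadataHook.update(), one would then set:
#     metadata["dependencies"] = locked_dependencies()
print(locked_dependencies())
# -> ['grpcio >=1.46.0,<2.0.0', 'grpcio-tools >=1.46.0,<2.0.0']
```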

1 Like

An approach that handles this in a custom plugin for the package build system seems like a good one (and might be a better answer than the cog-style code generation I suggested above). I’d prefer that this be handled there, rather than in the dependency specifiers that indexes and installers like PyPI/pip need to handle.

They should really be declaring dependencies as dynamic if they are doing that :slight_smile:

2 Likes