What information is needed to choose the right dependency (file) for a platform?

Selecting the right thing to install to satisfy a dependency seems to come down to:

  1. Name
  2. Version requirements
  3. Environment markers
  4. Supported wheel tags

The latter two are specific to the environment being installed for.

Is there something I’m missing, or does knowing the values of the environment markers and the list of supported wheel tags provide all the info required to resolve whether a requirement works for a specific platform and Python environment?
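
To make that concrete, here's a rough sketch of those checks using the packaging library; the requirement string and wheel filename are just made-up examples, and a real resolver obviously does far more than this:

```python
# Sketch: do environment markers and wheel tags answer "does this file work here?"
from packaging.markers import default_environment
from packaging.requirements import Requirement
from packaging.tags import sys_tags
from packaging.utils import parse_wheel_filename

# 1 & 2: name and version requirements (plus a marker), from the requirement string.
req = Requirement('numpy>=1.21; python_version >= "3.8"')

# 3: environment markers -- does the requirement even apply to this environment?
env = default_environment()  # marker values for the running interpreter
applies = req.marker is None or req.marker.evaluate(environment=env)

# 4: supported wheel tags -- is this particular artifact usable here?
name, version, _build, wheel_tags = parse_wheel_filename(
    "numpy-1.26.4-cp312-cp312-macosx_11_0_arm64.whl"
)
usable = bool(wheel_tags & set(sys_tags()))

# And the version requirement itself.
version_ok = version in req.specifier

print(applies, usable, version_ok)
```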


I think there are some subjective things that determine which artifact is ‘best’ too, e.g. pip’s --no-binary, --only-binary and --prefer-binary flags, unless you’re assuming one of these strategies is always ‘best’.


I’d call those “format preferences” or whatever. :slight_smile:

I can’t come up with anything else – these five (the four above plus format preferences) seem to be the relevant things.

Pretty much. Those are the only two selection mechanisms we have – which is part of what makes having wheels that support GPUs an annoying problem to solve. :slight_smile:


That’s the way I’m thinking about it as well.

Great!

Just to be transparent, I’m thinking about what an environment “dump” would look like in order to create a lock file for a platform other than the one you’re running on. For instance, let’s say you have a lock file for local development on macOS, but you deploy to a cloud host. How does that cloud host help you keep synchronized lock files? One option would be for them to give you a Dockerfile to generate your lock file in, but another might be some JSON file hosted somewhere with all the relevant environment details to act as input to the resolver. That way you could use a locker to generate lock files for both macOS and your cloud host simultaneously.
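
Just to sketch the idea (the file name and JSON shape here are entirely made up, and a real format would need actual design work), such a dump could be little more than the marker values plus the ordered list of supported tags:

```python
# Illustrative only: serialize the environment details a resolver would need.
import json

from packaging.markers import default_environment
from packaging.tags import sys_tags

dump = {
    "markers": default_environment(),          # os_name, platform_machine, etc.
    "tags": [str(tag) for tag in sys_tags()],  # most- to least-preferred order
}

with open("environment-dump.json", "w") as f:
    json.dump(dump, f, indent=2)
```

A cloud host could publish a file like that, and a locker could then resolve against it without ever running on that platform.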


Adding a cross-reference to a pip issue, where there’s some discussion about making it possible to supply this environment information to pip externally:


I replied over in the pip issue to avoid splintering the conversation.
