It would probably make more sense as an ABI tag rather than a platform tag (or more of a conda_win_amd64-type platform tag), but the principle makes sense. If Conda also searched PyPI for packages, this would mean packagers would just have to publish a few additional wheels that:
- don’t vendor things available as conda packages
- do include additional dependencies for those things
- link against the import libraries/headers/options used for the matching Conda builds of dependencies
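As a sketch of where a Conda-specific tag could slot in: wheel filenames already reserve a platform-tag position, so a hypothetical `conda_win_amd64` tag would just occupy it. The parser below is deliberately simplified (real wheel filenames can also carry optional build tags and compressed tag sets), and the filename itself is invented for illustration:

```python
# Simplified parser for the {name}-{version}-{python}-{abi}-{platform}.whl
# naming convention. "conda_win_amd64" is a hypothetical tag, not a standard.
def parse_wheel_filename(filename):
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,
        "abi": abi_tag,
        "platform": platform_tag,
    }

parsed = parse_wheel_filename("numpy-1.26.4-cp312-cp312-conda_win_amd64.whl")
print(parsed["platform"])  # conda_win_amd64
```

An installer that recognized the tag could prefer such wheels; pip, which doesn't know the tag, would simply never select them.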
Those three points are the critical ones that make sharing builds between Conda and PyPI impossible (or at least, against the design) regardless of direction.
NumPy installed through PyPI needs to vendor anything that can’t be assumed to be on the system. NumPy installed through Conda must not vendor it, because it should be using the same shared library as everything else in the environment. This can only realistically be reconciled with multiple builds and separate packages (or very clever packages).
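To illustrate the difference, here is roughly what the two install layouts look like. The paths and library names are illustrative, not exact, and the annotations are mine:

```
# From PyPI: the wheel vendors its own BLAS inside the package
site-packages/numpy/
site-packages/numpy.libs/libopenblas-<hash>.so   # private copy, used by this package only

# From Conda: the package links against the environment's shared copy
site-packages/numpy/
$CONDA_PREFIX/lib/libopenblas.so                 # one copy shared by every package
```

The same binary cannot do both: it either carries the library (and duplicates it in a Conda environment) or expects it to be provided (and breaks in a bare pip install).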
But of course, the majority of packages aren’t in this category, so could be shared just fine. And those that are in this category probably have good coverage already, because any Conda user will have discovered that they don’t just work and gotten them into conda-forge.
To add my hot take here: the best thing the PyPI side of the community can do is to be more honest about how limited it is at solving the “compatible packages” problem. Our version dependencies assume a pure-Python ABI, and have no way for builds to require specific builds (very little of our tooling lets you do things like “sdist requires spam>1.0 but wheel requires spam==1.7”). But we let people go out there and declare “wheels are the solution to everything” or some such thing, while most of us deep in the system aren’t doing that, because we know.
We know that:
- you need specific builds of Python
- you need specific package builds to match those specific builds
- preinstalled Pythons probably don’t match PyPI built packages
- native dependencies need to be installed separately, and you probably have to read the package docs to figure out what you need
- you might have to add additional constraints to get a working environment, not one that merely satisfies the metadata dependencies
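The first three bullets boil down to interpreter build identity. A quick stdlib probe shows the values a binary wheel has to match (nothing here is hypothetical, though the exact values obviously vary by build and platform):

```python
# Why "a Python" is not enough: these values differ between interpreter
# builds, and a binary wheel must match all of them to load correctly.
import sys
import sysconfig

build_identity = {
    # cp312, cp313, ... -- the interpreter tag a wheel targets
    "python_tag": f"cp{sys.version_info.major}{sys.version_info.minor}",
    # ABI suffix baked into extension module filenames (may be None
    # on some builds, e.g. Windows)
    "soabi": sysconfig.get_config_var("SOABI"),
    # platform the interpreter itself was built for
    "platform": sysconfig.get_platform(),
}
for key, value in build_identity.items():
    print(key, "=", value)
```

Run this under a distro Python, a python.org Python, and a Conda Python and you can see exactly where a PyPI-built wheel stops being guaranteed to match.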
I have seen professional software engineers mess up all of these (and have messed them all up myself at times). Why are we surprised that regular users look at what comes preinstalled, assume it can handle all of this, and then get frustrated when they find out what we always knew? (And get even more frustrated when they find out that we already knew and haven’t fixed it or warned them!)
From my POV, the minimum we’d need to do for Conda to be able to use wheels is:
- define “native requirements” metadata (even if pip ignores it)
- allow (encourage) wheels with binaries to have tighter dependencies than their sdists
- encode and expose more information about ABI in package requirements
- define some set of “conda” platform or ABI tags
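As a sketch of how those four could surface in a wheel’s METADATA: `Requires-Dist` is real, but every other field name below is invented for illustration and exists in no spec (the `#` annotations are mine; real metadata has no comment syntax):

```
Name: numpy
Version: 1.26.4
# Tighter pin than the sdist's "spam>1.0", allowed for binary builds:
Requires-Dist: spam==1.7
# Hypothetical native-requirements field (pip could safely ignore it):
Requires-Native: openblas >=0.3
# Hypothetical ABI declaration for the build it was linked against:
Requires-Abi: openblas-ilp64
# Hypothetical conda platform/ABI tag:
Platform-Tag: conda_win_amd64
```

The point isn’t these specific names; it’s that none of this information has anywhere to live today.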
Everything else can be done on the Conda side, integrated into their own builds of CPython. But the viability of Conda consuming wheels depends on the features above, and we just don’t have equivalents on the wheel/sdist side right now.