FWIW, all versions of MSVC for the last eight years are compatible with all versions of CPython since 3.5.[1] Which I get is a bit confusing for people, but it does at least mean we don’t have to constantly update version numbers in docs every month.
The problem is that package developers don’t necessarily make their code work with MSVC in the first place, or they’re dependent on libraries that haven’t done it, or they rely on OS functionality that doesn’t exist on Windows at all, in which case the package fundamentally doesn’t make sense to port! So even if the user manages to get the tooling, they can’t make the package work, because the developer never did.
The only thing likely to change on the MSVC side is a way for tools to download and use a copy of the compiler. But we’re still talking 2GB+ downloads, which means virtually nobody should ever do this, and certainly not without getting the user to agree. Windows is designed around distributing binaries, and at this stage nothing much is going to change that - we’re far better off trying to fit into that than to resist it.
This is basically what everyone’s CI systems do, and it doesn’t actually make things any simpler.
Maybe when we get some build backends that are able to download non-Python dependencies as part of the sdist->wheel build, it’ll be feasible to use one of the existing CI systems to just `-m build` from sdists in a clean environment, but that’s going to be a complete build rewrite for many projects. We’ve got a lot of “just make it work” culture to unwind first, or alternatively, a new community of external build scripts for existing packages that ensure they all build in the same environment (e.g. all the distros).
(FTR, I agree with everything you’ve said in response to my earlier post, which is why I haven’t quoted any of it.)
This one I think is fine for `pip install` to be the unified/generic command to show on PyPI, because it is how you get packages from PyPI. What’s missing is those cases where a user in a particular environment should not be getting certain packages from PyPI, but from their distributor.
I can imagine some kind of data file in the base Python install listing packages (or referencing a URL listing packages) that `pip` should warn about before installing, along with messages to display telling the user what they should prefer. Distributors could bundle this file in with their base Python package so that anyone using it gets redirected to the right source.[2] A similar kind of idea is going into the `sysconfig` work to let Linux distros specify their own settings, and I’m sure it generalises to packages, too.
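As a rough sketch of what such a check might look like: everything here is hypothetical - the `redirected-packages.json` name, its schema, and the helpers are invented purely for illustration, not an existing or planned pip feature.

```python
import json
import sys
from pathlib import Path

# Hypothetical file a distributor could ship alongside the base install,
# mapping package names to "get this from us instead" messages, e.g.:
#   {"numpy": "Use 'apt install python3-numpy' instead of pip."}
REDIRECT_FILE = Path(sys.prefix) / "redirected-packages.json"  # invented name

def load_redirects() -> dict[str, str]:
    """Read the distributor-provided redirect list, if one was bundled."""
    try:
        return json.loads(REDIRECT_FILE.read_text(encoding="utf-8"))
    except FileNotFoundError:
        return {}

def warn_before_install(requested: list[str]) -> None:
    """Show the distributor's preferred source before fetching from PyPI."""
    redirects = load_redirects()
    for name in requested:
        message = redirects.get(name.lower())
        if message:
            print(f"WARNING: {name}: {message}", file=sys.stderr)

warn_before_install(["numpy", "requests"])
```

The same mechanism works whether the file lists packages directly or points at a URL to fetch the list from, as suggested above.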
[1] The C++ library, however, is not. So when extensions use C++ (CPython does not), they may end up with conflicting versions; that generally shouldn’t be a problem, but could potentially lead to shared-state issues. But this is fundamentally the same issue as trying to share versions of `libpng` or any other shared library, and is really the responsibility of the system integrator to solve.

[2] Though they could do this today by publishing their own index of wheels and setting a default `index-url` value in the site’s `pip.ini`/`pip.conf`. There are a few gaps in this, but fundamentally it’s straightforward and totally viable (and sufficient for my scenarios where I want to block PyPI entirely, just not for when you’re merely augmenting it).
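For reference, the kind of site-wide configuration that footnote describes might look something like this (the index URL is a placeholder for the distributor’s own wheel index):

```ini
# In the site's pip.ini (Windows) or pip.conf (elsewhere);
# the URL below is a placeholder, not a real index.
[global]
index-url = https://wheels.example-distro.org/simple
```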