As of ~15 minutes ago, wheels uploaded to PyPI have their METADATA files served alongside them on files.pythonhosted.org, with the appropriate information exposed in the Simple API to indicate whether the metadata is available for a given file.
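As a rough sketch of what a client sees: in the JSON Simple API (PEP 691), each file entry carries a flag indicating whether its metadata is served separately, and PEP 658 specifies that the metadata lives at the file URL plus `.metadata`. The sample response below is hypothetical (filenames and hashes are placeholders), and the flag has appeared under more than one JSON key over time, so this checks both.

```python
import json

# Hypothetical excerpt of a PEP 691 JSON Simple API response for one file.
# Real responses come from e.g. https://pypi.org/simple/<project>/ requested
# with the Accept header "application/vnd.pypi.simple.v1+json".
sample = json.loads("""
{
  "files": [
    {
      "filename": "example_pkg-1.0-py3-none-any.whl",
      "url": "https://files.pythonhosted.org/packages/example_pkg-1.0-py3-none-any.whl",
      "hashes": {"sha256": "placeholder"},
      "core-metadata": {"sha256": "placeholder"}
    }
  ]
}
""")

def metadata_url(file_entry):
    """Return the PEP 658 metadata URL for a file entry, or None.

    PEP 658 serves the wheel's METADATA at the file URL plus ".metadata".
    The availability flag has been spelled both "core-metadata" and
    "dist-info-metadata" in the JSON API, so check both keys.
    """
    flag = file_entry.get("core-metadata") or file_entry.get("dist-info-metadata")
    if not flag:  # false or absent: no separately served metadata
        return None
    return file_entry["url"] + ".metadata"

for f in sample["files"]:
    print(metadata_url(f))
```

This is what lets a resolver fetch just the few-kilobyte METADATA file instead of a multi-hundred-megabyte wheel when it only needs the dependency list.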
Since pip already has support, we anticipate we’ll hear any feedback rather quickly. If things go smoothly, the PyPI administrators will then assess what backfilling looks like.
PEP 658 was mainly motivated by allowing a package installer to look at multiple versions of a project more efficiently during dependency resolution, but sdist metadata is generally not useful on its own for this purpose and was thus left out of the proposal. That said, the current specification leaves enough room for anyone to submit a proposal for sdists to use the same interface.
ISTM that with Core Metadata 2.2 (PEP 643, the Dynamic field for sdists), sdist metadata could in fact be quite useful for a lot of cases. But that’s also blocked on PyPI support, though I hear there is a desire to get it unblocked at some point soon-ish.
Would be cool to backfill some of the notorious fat wheels, like pytorch, tensorflow and the like. But maybe these are not the main contributors to PyPI bandwidth?
I hope everyone who helped make this possible realizes just how much of an improvement this is for the ecosystem. Not only will package indexes be serving less (-> less power/bandwidth), but package consumers will also be downloading less and running faster (-> less power/bandwidth AND faster dev cycles).
So it looks like the referenced PR only adds hashes to the wheel URLs. I’ve started a conversation with the torch folks to get true PEP 658 support (I already have a branch; it’s just missing checksum support). Backfilling would also be nice.
What were you expecting? Before sdist metadata can be exposed, we need reliable sdist metadata (i.e. metadata 2.2 support). I would not expect PyPI to ever bother exposing older metadata for sdists - certainly pip wouldn’t use it because we couldn’t rely on it.
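To illustrate why Metadata 2.2 is the prerequisite: under PEP 643, a field in an sdist’s PKG-INFO is only trustworthy if the metadata version is 2.2 or later and the field is not declared Dynamic. A minimal sketch of that check, using a hypothetical PKG-INFO sample and the standard library’s RFC 822-style parser:

```python
from email.parser import HeaderParser

# Hypothetical PKG-INFO content from an sdist. Field names follow the core
# metadata spec; PEP 643 added Metadata-Version 2.2 and the Dynamic field.
pkg_info = """\
Metadata-Version: 2.2
Name: example-pkg
Version: 1.0
Requires-Dist: requests
Dynamic: Requires-Dist
"""

def reliable_requires_dist(raw):
    """Can an installer trust Requires-Dist from this sdist's metadata?

    Per PEP 643: only if Metadata-Version is 2.2+ AND Requires-Dist is not
    listed as Dynamic (i.e. not subject to change when the sdist is built).
    """
    msg = HeaderParser().parsestr(raw)
    version = tuple(int(p) for p in msg["Metadata-Version"].split("."))
    if version < (2, 2):
        return False  # pre-2.2 sdist metadata cannot be relied upon
    dynamic = {v.lower() for v in msg.get_all("Dynamic", [])}
    return "requires-dist" not in dynamic

print(reliable_requires_dist(pkg_info))  # False: this sample marks it Dynamic
```

With pre-2.2 sdists there is no way to distinguish “this is the final value” from “this may change at build time,” which is exactly why exposing their metadata wouldn’t help pip.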
Lastly, I got in contact with some folks at Meta [1] to help move things along and also to have them upload the checksums for their files, for scraping.
This is all too exciting.
[1] I actually got in contact with the right people through a connection I made at PyCon this year (my first). Just goes to show the value of attendance goes far beyond the content.