As of ~15 minutes ago, wheels uploaded to PyPI will have their METADATA files served alongside them on files.pythonhosted.org, with the appropriate information exposed in the Simple API to determine whether the file is available.
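For anyone curious what checking for that looks like, here's a rough sketch. The sample response below is invented for illustration; real data comes from the JSON Simple API (e.g. https://pypi.org/simple/&lt;project&gt;/ requested with `Accept: application/vnd.pypi.simple.v1+json`), and a client should accept both the PEP 714 `core-metadata` key and the older PEP 658 `dist-info-metadata` key:

```python
import json

# Invented excerpt of a PEP 691 JSON Simple API response, for illustration only.
sample_response = json.loads("""
{
  "files": [
    {
      "filename": "example-1.0-py3-none-any.whl",
      "url": "https://files.pythonhosted.org/packages/.../example-1.0-py3-none-any.whl",
      "core-metadata": {"sha256": "abc123"}
    },
    {
      "filename": "example-1.0.tar.gz",
      "url": "https://files.pythonhosted.org/packages/.../example-1.0.tar.gz"
    }
  ]
}
""")

for f in sample_response["files"]:
    # A truthy value under either key means the METADATA file is served
    # next to the distribution at <url>.metadata
    if f.get("core-metadata") or f.get("dist-info-metadata"):
        print(f["filename"], "has standalone metadata")
```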
Since pip already has support, we anticipate we’ll hear any feedback rather quickly.
Otherwise, if things go smoothly the PyPI administrators will assess what backfilling looks like.
Awesome news, thanks to everyone who worked to get this in place!
This is a nice enhancement.
sdists similarly have a PKG-INFO file, I wonder if there are any plans for those?
PEP 658 was mainly motivated by allowing a package installer to look at multiple versions of a project more efficiently during dependency resolution, but sdist metadata is generally not useful on its own for this purpose and was thus left out of the proposal. The current specification leaves enough room, though, for anyone to submit a proposal for sdists to use the same interface.
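To illustrate why this helps resolution: a resolver only needs the core metadata fields, which live in the RFC 822-style METADATA file, so it can fetch that tiny file instead of the whole wheel. A sketch with the stdlib (the METADATA content here is made up):

```python
from email.parser import Parser

# Invented METADATA content; in practice this would be fetched from
# files.pythonhosted.org rather than by downloading the entire wheel.
metadata_text = """\
Metadata-Version: 2.1
Name: example
Version: 1.0
Requires-Python: >=3.8
Requires-Dist: requests (>=2.0)
Requires-Dist: tomli ; python_version < "3.11"
"""

# Core metadata is RFC 822-style, so the stdlib email parser handles it.
msg = Parser().parsestr(metadata_text)
print(msg["Name"], msg["Version"])
print(msg.get_all("Requires-Dist"))  # the fields a resolver cares about
```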
ISTM that with Core Metadata 2.2 (PEP 643, the Dynamic field for sdists), sdist metadata could in fact be quite useful in a lot of cases. But that's also blocked on PyPI support, though I hear there is a desire to get that unblocked at some point soon-ish.
Is there any good news to share about bandwidth savings or any major issues so far?
I don’t suppose any further news on whether there will be some backfill operations happening?
Would be cool to backfill some of the notorious fat wheels, like pytorch, tensorflow and the like. But maybe these are not the main contributors to PyPI bandwidth?
We’re tracking some issues with the interaction between a bug in PyPI’s implementation and pip’s handling of 658 enabled JSON indexes.
No backfill will occur until those are sorted in a coordinated way.
PEP 714 has been accepted and implemented and is deploying now.
Once we get some feedback that things are good, we will start designing a backfill mechanism.
PEP 658/714 support has been merged in pip, and will be available in the next release (due in July).
If you’re in the Data Science community: Torch cheeseshop support has a ticket on the torch repo, an issue on the repo that backs their cheeseshop, and a PR that claims to add the necessary support. So soon, doing a --dry-run or using another tool like Poetry or Pex might skip your multi-GB download for lockfile resolution.
How monumental, y’all.
I hope everyone who helped make this possible realizes just how much of an improvement this is for the ecosystem. Not only will package indexes be serving less (-> less power/bandwidth), but package consumers will also be downloading less and running faster (-> less power/bandwidth AND faster dev cycles).
So it looks like the referenced PR only adds hashes to the whl URLs. I started a conversation with the torch folks to get true PEP 658 support (I already have a branch; it’s just missing checksum support. Also, backfilling would be nice).
Also, just look at this URL. It’s so beautiful: https://files.pythonhosted.org/packages/70/8e/0e2d847013cb52cd35b38c009bb167a1a26b2ce6cd6965bf26b47bc0bf44/requests-2.31.0-py3-none-any.whl.metadata
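And the convention behind that beautiful URL is just the wheel URL plus a `.metadata` suffix (per PEP 658), so clients can derive it directly:

```python
# Wheel URL from the Simple API; the PEP 658 metadata file lives at the
# same URL with ".metadata" appended.
wheel_url = (
    "https://files.pythonhosted.org/packages/70/8e/"
    "0e2d847013cb52cd35b38c009bb167a1a26b2ce6cd6965bf26b47bc0bf44/"
    "requests-2.31.0-py3-none-any.whl"
)
metadata_url = wheel_url + ".metadata"
print(metadata_url)
```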
I’m very eager for the backporting.
What were you expecting? Before sdist metadata can be exposed, we need reliable sdist metadata (i.e. metadata 2.2 support). I would not expect PyPI to ever bother exposing older metadata for sdists - certainly pip wouldn’t use it because we couldn’t rely on it.
Ah sorry Paul, I meant my own referenced PR for the pytorch index, which only exposes the hashes and not the rest of PEP 658 support.
OK, at the risk of being a slight bit noisy: another update on the Torch index front (because let's be real, that index has some HUGE wheels).
Provide file hashes in the URLs to avoid unnecessary file downloads (bandwidth saver) by matteius · Pull Request #1433 · pytorch/builder · GitHub is primed and ready to add sha256 hashes to packages (and when supported, metadata files). Kudos to @matteius
I also opened Support PEP 658, without hashes by thejcannon · Pull Request #1457 · pytorch/builder · GitHub to support PEP 658/714 metadata hosting.
Lastly, I got in contact with some folks at Meta to help hurry things along and also to have them upload the checksums for their files, for scraping.
This is all too exciting.
I actually got in contact with the right people through a connection I made at PyCon this year (my first). Just goes to show the value of attendance is far beyond the content.