Pipenv also explored using the JSON API a while ago, but eventually came to the conclusion that it’s not worth it. Aside from the various problems Warehouse has with inspecting uploaded artifacts, there’s a fundamental problem in how the API presents the information.
By design, each wheel uploaded (for the same version) can have a different set of dependencies, but the JSON API only displays one set per version, chosen seemingly arbitrarily (the first uploaded? I have no idea). Sdists are even worse, since the same artifact can produce different dependencies when built on different machines (e.g. if setup.py inspects C libraries at build time). This makes some of our users, uh, unsatisfied.
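To illustrate how that happens, here is a minimal sketch of the kind of build-time inspection an sdist’s setup.py might do. All names here are hypothetical, not taken from any real project: the point is only that the dependency list is computed on the build machine, so two machines building the same sdist can end up with different metadata.

```python
import ctypes.util


def compute_install_requires():
    """Hypothetical dynamic dependency logic, as a setup.py might run it."""
    deps = ["requests"]
    # If the build machine has no system OpenSSL, pull in a fallback.
    # A machine that *does* have it would get a shorter dependency list
    # from the very same sdist.
    if ctypes.util.find_library("ssl") is None:
        deps.append("pyopenssl")
    return deps


# setup.py would then call setup(install_requires=compute_install_requires()).
print(compute_install_requires())
```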
Now you might say this is an upstream problem; Python offers enough declarative syntax for packagers to declare unified dependencies across artifacts of the same version. But in practice maintainers have various reasons not to do it. Legacy support is one common reason (some high-profile projects support pip versions as old as 1.5). Or the maintainers might just not care; I’ve had pull requests rejected because things already work, and they see no value in improving the metadata. And honestly, why should they care?
In the end we ditched all JSON API calls in favor of a straight download-build-inspect flow, since it is the only reliable way to get the dependency set the user actually expects.
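The inspect step of that flow can be sketched roughly as follows: once an artifact has been downloaded (and, for an sdist, built into a wheel), the dependencies are read from the `Requires-Dist` headers in the wheel’s METADATA file. This is a simplified illustration, not Pipenv’s actual implementation; to keep it self-contained it constructs a tiny fake wheel in memory instead of downloading one.

```python
import io
import zipfile
from email.parser import HeaderParser


def requires_dist_from_wheel(wheel_bytes):
    """Read Requires-Dist entries from a wheel's .dist-info/METADATA."""
    with zipfile.ZipFile(io.BytesIO(wheel_bytes)) as zf:
        meta_name = next(
            n for n in zf.namelist() if n.endswith(".dist-info/METADATA")
        )
        headers = HeaderParser().parsestr(zf.read(meta_name).decode("utf-8"))
    return headers.get_all("Requires-Dist") or []


# Build a minimal fake wheel in memory for demonstration; the real flow
# would download (and possibly build) the artifact first.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "demo-1.0.dist-info/METADATA",
        "Metadata-Version: 2.1\n"
        "Name: demo\n"
        "Version: 1.0\n"
        "Requires-Dist: requests\n"
        "Requires-Dist: idna; python_version < '3'\n",
    )

print(requires_dist_from_wheel(buf.getvalue()))
# → ['requests', "idna; python_version < '3'"]
```

Because the metadata is read from the concrete artifact the user will actually install, it reflects exactly the dependency set that matters, at the cost of having to download and possibly build the package first.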
I guess what I’m getting at is that while fixing PyPI’s dependency presentation might seem like an easy step, it might not be as worthwhile as you expect.