PEP 711: PyBI: a standard format for distributing Python Binaries

I think this is a bit unfair; in the recent discussion a number of people have raised concerns about the wheel approach. https://pypackaging-native.github.io is the result of discussions here [I know many people reading this are well aware :wink: ], and while looking for a link to the PEP 704 discussion I came across a question about how to share .so files between wheels.

Many people are choosing to disengage, are using tools more suited to their problems (system packaging, conda, containers), or are quietly finding workarounds to put non-Python dependencies in wheels.


I want to co-sign basically everything @BrenBarn said above

I also suspect that if you go down @indygreg's suggestion of shipping your own compilers and start putting non-Python software in wheels, there will be pressure to put shared libraries into their own wheels (e.g. packaging libhdf5 for h5py, pytables, and netcdf to depend on), and then you are most of the way to re-writing conda.


I would say the sdists uploaded to PyPI are the backbone of the Python ecosystem, not the wheels. Treating sdists and wheels as being "at the same level" is not correct. As I said in another thread, sdists are the point of truth for what a release "is", and wheels are binary artifacts for one (of many) binary package managers, derived from the sdist (and which, by historical path, happen to be hosted adjacent to the sdists).


In all of these discussions I am not sure I have a clear idea of what it is about conda that does not serve people well. Among the reasons I think I have heard:

  • wall time to get from a tag to <tool> install package working with conda-forge. But that is a cost of a central build farm [ok, public CI] and can be addressed by a local channel layered on top of conda-forge (see the first sketch after this list).
  • it does not work with python.org binaries. But that is because conda provides its own Python that is built consistently with all of the C extensions.
  • the solver is slow. But that is due to trying to be "correct", some choices about which versions to (continue to) expose, and they just switched to a faster solver implementation.
  • it is controlled by a company. But that is not true anymore.
  • conda envs have to be activated. But that is because packages can install scripts and environment variables to be set on activation. Some of this is basically direnv but for environments rather than paths, and some of it is about getting C libraries to behave correctly in all cases (see the second sketch after this list).
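
To make the first point concrete, here is a minimal sketch of layering a local channel on top of conda-forge. It assumes conda-build is installed (for `conda index`), and the channel path and package name are just placeholders:

```sh
# Build your package locally and index the output directory so it can
# serve as a channel (requires conda-build for `conda index`).
conda build ./recipe --output-folder /srv/local-channel
conda index /srv/local-channel

# Prepend the local channel so it takes priority over conda-forge.
conda config --add channels conda-forge
conda config --add channels file:///srv/local-channel
conda config --set channel_priority strict

# New builds of `mypackage` are now picked up from the local channel
# without waiting on the conda-forge build farm.
conda install mypackage
```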
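
And for the last point, a minimal sketch of the activation hooks conda supports: anything a package drops into `etc/conda/activate.d/` under the environment prefix is sourced on activation, and the counterpart in `deactivate.d/` on deactivation. The variable and file names here are hypothetical:

```sh
# $CONDA_PREFIX/etc/conda/activate.d/mylib.sh
# Sourced every time the environment is activated; points a (hypothetical)
# C library at data files shipped inside the environment.
export MYLIB_DATA_DIR="$CONDA_PREFIX/share/mylib"

# $CONDA_PREFIX/etc/conda/deactivate.d/mylib.sh
# Sourced on deactivation; undoes the change so the setting does not leak
# into the shell outside the environment.
unset MYLIB_DATA_DIR
```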

As my comments suggest, I am not particularly persuaded by these arguments. Are there others I am missing, or am I not giving these issues enough weight?
