Python Packaging Strategy Discussion - Part 1

This seems to (pardon the pun) wish for a reinvention of the wheel, quite apart from the (IMO) completely open question of why OS vendors would even want to invest many, many person-years[1] into something that already works from their point of view. The introduction of a new architecture is a very exceptional event in that regard: the vendor has a strong interest in bootstrapping the new ecosystem, but without necessarily tying itself to long-term maintenance.

I also don’t see why an OS-provided package would even be that much preferable to one from any other distribution for that OS, because it couples you to the OS’s upgrade cycle for various key infrastructure packages, which – while the norm – can be avoided[2].


  1. It’s a black hole: once you start building a couple of packages, your users will want other packages, and now you’re running a new distribution that needs to be kept up to date, which is a huge job even assuming you already have all the infrastructure for it. ↩︎

  2. Perhaps not for libc or the graphics support stack. But, for example, by shipping its own libcxx, conda-forge can use newer features than are available in the OS-provided one, and can provide current packages on old macOS versions, which are still in broad use despite having been EOL’d by Apple. ↩︎