Part of the issue with any of these discussions is that:
- The actual problems (related to compiler toolchains, ABIs, distributing packages with compiled code in them, being able to express dependencies on non-Python libraries and tools, etc.) are quite complex,
- Most people here don’t have those problems as package authors, and in many cases they don’t have them as users either (simpler packages with some C/Cython code work fine as wheels),
- The solutions to those problems do necessitate some overhead, which makes them hard to accept for folks who don’t have the problems,
- The scientific computing and ML/AI community hasn’t always explained the problems in enough detail. Often it’s a long email/Discourse thread about one specific topic, and folks talk past each other because possible solutions are brought up before the problem is clearly explained.
That makes it difficult to get anywhere with this conversation.
I would also say that it’s not only Conda that solves these problems. PyPI has quite fundamental problems when dealing with complex packages/dependencies with C/C++/CUDA/Fortran/etc. code. Those kinds of problems are solved (mostly) by Conda, but also by Spack, Nix, Homebrew, Linux distros, etc. Of those, Conda, Spack and Nix all have the concept of environments that you can activate/deactivate.
I’ll do a little pre-announcement here: I’m making a serious attempt at comprehensively describing the key problems that scientific, ML/AI and other native-code-using folks have with PyPI, wheels and Python packaging. It will be a standalone website (first release by the end of the year) that is kept up to date, aimed to serve as a reference, so we hopefully stop talking past each other. It will not contain proposed solutions - this Discourse is the place for those. At most (for now) it will have some outgoing links to key PEPs and discussions around potential solutions to some of the issues.
I’ll reach out to invite you to participate in helping shape the content on some of the topics that you’ve brought up.