There is a C/C++ library called LibA, and a Python binding for LibA, called PyLibA (in my case, LibA = Qt5 and PyLibA = PyQt5).
There is a C/C++ library called LibB, which depends on LibA. I want to create a Python binding for that library, called PyLibB (in my case, LibB = poppler-qt5 and PyLibB = python-poppler-qt5). The Python binding is built on top of PyLibA. E.g., if there is a LibB function returning an instance of a LibA type, my binding returns an instance of the corresponding PyLibA type.
Since building C/C++ extensions is complex (have a look at Issues · frescobaldi/python-poppler-qt5 · GitHub; almost all issues are about installation…), I would like to provide wheels for PyLibB on PyPI.
My problem is that my wheels depend not only on the Python tag (i.e., the Python implementation), the (Python) ABI tag, and the platform tag, but also on the version of LibA that is used, since my PyLibB extension links against PyLibA and therefore against LibA.
My question is this: is there a way, for one PyLibB version, to distribute wheels for several PyLibA versions? Or should I just pin the PyLibA version and distribute wheels for that version only, constraining all users of my PyLibB library to use a specific version of LibA?
Also, is conda maybe more appropriate for that sort of thing than the PyPA ecosystem? (I’ve never used it.)
NB, you might want to move this to the #packaging category instead—you’re likely to get better expert help there.
I’m not a C extensions expert, but a somewhat similar problem crops up with things like different CPU/GPU/CUDA flags and implementations for PyTorch, TensorFlow, etc. You can’t do it with different wheels under the same distribution name, but there are a few ways to achieve it. PyTorch uses an extra index URL, with a separate index per CUDA version serving the correct wheels. Alternatively, you can have a metapackage or “core” package that contains no binding-specific code and delegates to binding-specific dependencies via extras, roughly like the sketch below.
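For illustration only, the extras approach might look something like this in the metapackage’s setup.py (all package names here are hypothetical, not existing projects):

```python
# setup.py for a hypothetical "pylibb" metapackage (names invented for illustration).
# The metapackage ships no compiled code; each extra pulls in a separately
# published wheel that was built and pinned against one specific PyQt5 release.
from setuptools import setup

setup(
    name="pylibb",
    version="1.0.0",
    packages=[],  # no Python code here, only dependency routing
    extras_require={
        "pyqt5-515": ["pylibb-qt515==1.0.0"],  # wheel linked against PyQt5 5.15.x
        "pyqt5-512": ["pylibb-qt512==1.0.0"],  # wheel linked against PyQt5 5.12.x
    },
)
```

Users would then pick the binding they need, e.g. `pip install pylibb[pyqt5-515]`.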
Yup, this is one of the things Conda was designed to handle better than the PyPA tools. You can have a single package recipe with multiple outputs, one built against each version of the binary dependency, with different dep specs and different build tags accordingly. You can also have constraints that don’t impose a hard requirement but only require a specific version of a package if it is installed, directly specify the supported Qt versions to be solved against and installed if needed, and more easily create metapackages and feature-flag packages to control the dependency version.
OK, I’ll do that next time. I was a bit afraid of posting about user problems, interpreting that category as intended for development of packaging tools and not for questions. But I guess complex packaging problems call for discussions on improving the tools as much as on how to work around their limitations.
I don’t think I have the permissions to move this topic now that it’s been created, though?
Thanks, good to know I’m not missing something obvious. The workarounds do sound complicated.
Thanks again. I’ll take a look and see if this ends up easier. python-poppler-qt5 was mainly developed for the benefit of one app (Frescobaldi), and since I am mostly interested in that app, I will most likely end up requiring PyQt5 == x.y in my wheels in the short term, but it would be nice to make the library more generally useful outside of Frescobaldi, and that may entail distributing it on conda.
You should be able to at TL2: just click the edit button next to the thread title and you should get the option in a dropdown over the category name. I know I can move others’ threads as a TL3 (but I haven’t been as active for a while, so I got downgraded, otherwise I’d do it for you).
Yes, and in fact there are already separate conda packages for qt and pyqt (as opposed to PyPI, where a binary PyQt5 package bundles the underlying Qt with it).
I have a question related to this. Suppose I accept the fact that I’ll be shipping my extension with a pin of the PyQt5 version. Is there a way to do this only for the wheels, but leave a lax requirement in the sdist for people building from source?
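E.g., something along these lines in setup.py (just an untested sketch of what I have in mind, with a made-up environment variable that my wheel-building CI would set; I don’t know whether this is considered acceptable practice):

```python
# setup.py sketch (untested): pin PyQt5 exactly only when the wheel is built,
# keep a lax requirement for sdist / source builds.
# PYLIBB_PIN_PYQT5 is a made-up variable set only by the wheel-building CI job.
import os
from setuptools import setup

if os.environ.get("PYLIBB_PIN_PYQT5"):
    # The compiled extension in the wheel is linked against this exact PyQt5 release.
    pyqt5_requirement = "PyQt5 == 5.15.2"
else:
    # Building from source: compile against whatever PyQt5 is available.
    pyqt5_requirement = "PyQt5 >= 5.12"

setup(
    name="python-poppler-qt5",
    install_requires=[pyqt5_requirement],
)
```

The idea being that the wheel’s metadata is frozen at build time with the tight pin, while people building from the sdist get the lax requirement.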
Good point. Ideally, I would test with more versions… but I don’t have the time.
Not sure what you mean by “a different source”?
And, yes, I know we should port to Qt6. Basically, we’re a few volunteers trying to take over maintenance of a project (the Frescobaldi music sheet editor) whose initial author doesn’t have time for it anymore. First we want to fix lots of packaging / distribution / installation issues, then the next thing to look at will probably be Qt6.
By the Qt Company, perhaps, due to their new strategy of phasing out FOSS LTS releases and pushing their commercial releases instead. But few folks get Qt directly from them, and distributors (Conda, Linux distros, Homebrew, KDE) are likely to support it for many years to come, as with Qt4 and GTK2. My impression of existing PyQt/PySide users is that a substantial majority of production installs (at least on the scientific and related side) are still using Qt5. In fact, some projects only dropped Qt4 support in the last few years.
Yeah, it’s not easy. I help maintain QtPy, the most widely used compatibility layer between PyQt and PySide and between Qt5 and Qt6 (previously Qt4 and Qt5, until the past couple of years). Of course, that’s aimed at the Python level, not C API users, though perhaps it can at least serve as a helpful guide.