How to deal with external dependencies in pyproject.toml?

Hello Python Experts,

I see that there is a post on this already. However, it’s pretty long and I do not want to spend the whole day reading it. What I need is to add XROOTD as a dependency of my project. Normally I would do:

# For the actual program
conda install xrootd

# For the python bindings
pip install xrootd

And I would like to make sure the user of my project cannot proceed with the installation if XROOTD is not installed. How can this be specified in the pyproject.toml?

Without this, it seems that pip will just try to build XROOTD itself, and for that it needs gcc, cmake, etc. This is definitely not what we want here. The user should just install XROOTD in any way he wants: aptitude, an RPM, building from source, or through conda. And only once the program is installed should the user install the bindings.

Cheers.

Currently, you can’t, at least not in a way that works across tools, and especially not in a way that is enforced when installing a wheel.

At some point something might be added. To get that context, read the post you linked.


One guaranteed way would be to use the system package and dnf/apt install the dependency and/or the project using the dependency. Then the distro packagers can guarantee you the desired experience.

In limited cases we can design a package to check for the system package before building a bundled version, but:

  • it requires always building from sdist
  • the user must still manually pre-install the dependency

The post you linked could in theory resolve the user experience and give you control to automatically handle the system package installation at the pip install level. But there are so many technical issues to resolve there, e.g. how you would specify the backend that installs the system dependency, or how to define the required package for each distro.
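To make those open questions concrete, a resolution might hypothetically look something like this in pyproject.toml. To be clear, this syntax does not exist today and no tool understands it; it is purely illustrative, and the table and package names are made up:

```toml
# Hypothetical, non-existent syntax -- only to illustrate the two open
# design questions: how to name an external dependency abstractly, and
# how to map it to a concrete package per distro/backend.
[external]
dependencies = ["xrootd"]

[external.package-map]
debian = ["xrootd-client"]
fedora = ["xrootd-client"]
conda = ["xrootd"]
```

Even this toy sketch immediately raises the problems mentioned above: who maintains the per-distro map, and which tool is responsible for acting on it.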

But currently your best bet is to get in touch with distro/conda/spack etc. packagers and ask them to package the projects.

Dear @Lecris,

Thanks for your reply

What do you mean by this? In order to install XROOTD I do:

conda install xrootd

So the problem is not installing the package. The problem is that the user does not know that the package needs to be installed. So I am looking for something like:

RAISE Exception: Cannot install, missing external dependency XrootD

then the user would just install it. In practice what happens is that pip seems to try to build XROOTD from source, and that is just a mess, because you need a bunch of tools for that and we do not even want to go there. Binaries for XROOTD exist; the user only has to sudo apt install xrootd or something like that.
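One partial workaround for exactly that error message is a guard at the top of setup.py. It only works when installing from an sdist (a pre-built wheel never runs setup.py, as noted above). A minimal sketch, assuming the XRootD client puts an `xrdcp` binary on PATH; the helper name is made up:

```python
import shutil


def ensure_external(binary: str, package: str) -> None:
    """Abort the install early if an external tool is not on PATH.

    Raises SystemExit with an actionable message instead of letting the
    build fail later with an obscure compiler/cmake error. Note: this
    only runs for sdist installs; wheels install without setup.py.
    """
    if shutil.which(binary) is None:
        raise SystemExit(
            f"Cannot install: missing external dependency {package}. "
            f"Install it first, e.g. via apt, dnf, conda, or from source."
        )


# In setup.py, before calling setup():
# ensure_external("xrdcp", "XRootD")
```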

Cheers.

Hmm, but for system-binding types of packages there could be another approach, if we ignore Windows systems.

Let’s assume we have a minimal python-foo project that is just bindings to a foo project typically installed from a distro, from conda, or straight from PyPI. python-foo would have an optional dependency on foo for the PyPI package, and internally it would either use an RPATH link or a patched PATH to prefer this option.

But internally it can run a simple smoke test (like running foo --version) and, if it fails, raise an ImportError prompting the user to choose how to install foo from system/pip. Conda and Spack environments could probably also work :thinking:. Is this a good UX you could expect?
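A minimal sketch of that import-time smoke test, using the hypothetical `foo` binary from above (the function name is made up):

```python
import shutil
import subprocess


def check_runtime(binary: str) -> None:
    """Smoke-test an external dependency when the bindings are imported.

    Raises ImportError with an actionable message instead of letting the
    user hit an obscure linker or runtime failure later.
    """
    if shutil.which(binary) is None:
        raise ImportError(
            f"External dependency '{binary}' not found on PATH. "
            f"Install it via your system package manager, conda, or pip "
            f"before using these bindings."
        )
    try:
        subprocess.run([binary, "--version"], check=True, capture_output=True)
    except (OSError, subprocess.CalledProcessError) as exc:
        raise ImportError(
            f"'{binary} --version' failed; the installation looks broken."
        ) from exc


# At the top of python_foo/__init__.py:
# check_runtime("foo")
```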

This all requires the project to be designed for this compatibility, and to be aware of and collaborate with the packaging environments.


If I really needed a dependency to be installed specifically from conda, I’d ship for conda-forge, not PyPI. Just saying.

If there were a clean way of doing this, the very Python bindings package for xrootd on PyPI would itself already be doing it.