Is there a way that the shared libraries needed by Pybind11 can be found at runtime?

I am currently using Pybind11 to create Python extensions for my C++ library, but I have run into a problem: I build and package the extension into a wheel, but at runtime it reports an error saying that the corresponding shared library cannot be found:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot open shared object file: No such file or directory

Part of my code is as follows:

ext_modules = [
    # ... extension definition, including
    # runtime_library_dirs=[sysconfig.get_config_var("LIBDIR")] ...
]

setup(
    # ...
    ext_modules=ext_modules,
    cmdclass={'build_ext': build_ext},
    # ...
)
This wheel runs fine after being installed on the machine where I built it, but if I install it on another machine, the above error is reported.

I think the problem lies in runtime_library_dirs=[sysconfig.get_config_var("LIBDIR")]: at build time this fixes the extension's runtime search path to the LIBDIR of the Python environment on the packaging machine, so it does not work on other machines with different environments. Is there any way to solve this problem?

P.S. I also want to do similar packaging on macOS. Is there anything I need to pay attention to?

On Linux, what does ldd report for the installed .so?

I assume that you will see that a dependent .so is missing on the target but is present on your build system.

What I do is install the dependent .so files in the same folder as the extension .so. But you might instead want to have the user install the system libraries for their Linux distro, etc.
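With that layout, the extension also has to know to look in its own directory at runtime. One way to do that (a sketch, assuming a setuptools/pybind11 build; the variable names here are mine, not from the original post) is to embed a relative rpath via extra_link_args instead of a fixed runtime_library_dirs:

```python
import sys

# A minimal sketch, not the poster's actual setup.py.
# "$ORIGIN" (Linux) and "@loader_path" (macOS) are expanded by the dynamic
# loader at load time to the directory containing the extension module
# itself, so dependent libraries placed next to it are found on any machine.
if sys.platform == "darwin":
    rpath_flag = "-Wl,-rpath,@loader_path"
else:
    rpath_flag = "-Wl,-rpath,$ORIGIN"

# This would replace runtime_library_dirs=[sysconfig.get_config_var("LIBDIR")]
# in the extension's keyword arguments:
ext_kwargs = {"extra_link_args": [rpath_flag]}
print(ext_kwargs["extra_link_args"][0])
```

The key difference from runtime_library_dirs is that the embedded path is relative to the extension, not an absolute path baked in from the build machine.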

But on macOS you will always need to provide the .dylibs.

Thank you for your reply, but I don’t want to install these dependent .so files in the Linux system library path. I want to be able to install them in any location, as long as the extension can correctly find the .so libraries at runtime.

So, if I understand you correctly: assuming the runtime shared library that my Python extension depends on is installed in the same directory as the extension itself, will it then run normally?

Yes that should work.

I built the wheel and installed it. The extension depends on a shared library installed in the same directory, but when I import the module, it fails with the same error. XP

You need to use the debug options to see what is going on.
See the man page for the dynamic loader (ld.so) and the environment variables it uses; in particular look at LD_DEBUG.

I think the problem may be that this .so is still in the wrong place. Without rebuilding, try to copy the to wherever sysconfig.get_config_var("LIBDIR") points to (which will not be the same as the site-packages dir), then check again.

If the dependency is merely a dependency (not itself imported by an import statement), you can actually install it wherever you want (and have write permission), but if it’s not in a standard place, you’d also need to set LD_LIBRARY_PATH (or DYLD_LIBRARY_PATH on macOS).
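One caveat with that route: the loader reads LD_LIBRARY_PATH once, when the process starts, so setting it from inside an already-running interpreter does not affect later imports. A common workaround (a sketch; the library directory below is a made-up example) is a wrapper that re-execs the interpreter after setting the variable:

```python
import os
import sys

LIB_DIR = "/opt/myproject/lib"  # hypothetical location of the dependent .so


def ensure_library_path() -> bool:
    """Return True if the environment was just modified (re-exec needed)."""
    current = os.environ.get("LD_LIBRARY_PATH", "")
    if LIB_DIR in current.split(os.pathsep):
        return False  # already visible to the loader
    os.environ["LD_LIBRARY_PATH"] = os.pathsep.join(
        p for p in (LIB_DIR, current) if p
    )
    # A real wrapper would now restart the interpreter so the loader
    # actually sees the new value:
    # os.execv(sys.executable, [sys.executable] + sys.argv)
    return True
```

The second call returns False because the first call already added the directory to the variable.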

Thank you for your reply! Actually, I want to package the modules and upload the wheels to platforms like PyPI for distribution, so I want these wheels to run normally after being installed on different machines. However, if runtime_library_dirs is fixed at build time, it may be invalid on other machines (the path may not exist there at all). With this in mind, I considered the following possible solutions and questions:

  1. Install the shared libraries that the module depends on in the system standard path.
  2. Is there a way for the wheel to automatically set the module’s library search path to the chosen installation path during installation?
  3. For wheels meant for distribution, should we simply avoid depending on any shared libraries, to sidestep the above troubles?

I finally solved the problem: to achieve this, I needed to use auditwheel to repair the .whl package. After that, the module runs normally. Thank you to everyone who helped. :yum:
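For reference, the repair step looks roughly like this (a sketch; the wheel filename is a placeholder, and auditwheel itself only runs on Linux):

```python
import subprocess

# auditwheel copies the external shared libraries into the wheel, rewrites
# the extension's rpath so they are found next to it, and retags the wheel.
wheel = "dist/mymodule-0.1.0-cp310-cp310-linux_x86_64.whl"  # placeholder name
cmd = ["auditwheel", "repair", wheel, "-w", "dist/repaired"]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment on a Linux build machine
```

The macOS counterpart is delocate (the delocate-wheel command), which does the same bundling for .dylib dependencies.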
