Linking against installed extension modules

The (most obvious) problem with linking directly to another package’s extension module is that our packaging system doesn’t let you specify dependencies tightly enough to be sure you’ll get a matching build after installation. This is why wheels tend to bundle copies of their dependencies rather than link to another package (as conda packages can, since conda specifies dependencies precisely enough).

Isolated builds also prevent you from doing this when building at install time, as your build-time dependencies may differ from the dependencies already present in your target environment.

It can work if the package you depend on is reliable enough and careful enough with its public API that it promises not to change it basically ever, or has an extensibility model that lets it manage change (and the processes to actually manage it, not just a statement of intent to manage it). Basically, it’s a large burden on the package you want to use, in order to reduce the burden on you, the user.

NumPy’s approach is probably the easiest way to scale and maintain a public native API. If they want, they can add a new attribute for a new version of the API (personally I’d have made it a function rather than an attribute, to allow for deprecation warnings/lazy initialisation), and they can ensure that a particular API object is aware of the current interpreter, module state, and any other not-quite-global state that matters.

Another approach might be to replace the single capsule with an API object that lets you request individual entry points (e.g. by name or some unique identifier), so that instead of consuming one big C struct, you’d request each function separately and only ever deal with a single function pointer at a time.

But ultimately, it’s more work and maintenance for the implementer than the consumer. Make it worth their time :wink:

(There’s no doubt some useful further reading at https://pypackaging-native.github.io/ though I don’t recall if there’s anything specifically addressing this scenario.)
