Writing C extensions so that the compiled extension can be used by multiple Python versions

On Linux we can reuse our compiled extension by preloading the Python library.
However, this method does not work on Windows, probably because of this:

MS_COREDLL define.
Actually, it almost works. I went into the python3.12 directory, made a copy of python312.dll, called it python311.dll, and then it worked! But obviously this is not great.
I assume Py_LIMITED_API would solve my Windows issues? Our extension is complex, so I’m not sure it has everything we need.
Does it have a bad impact performance-wise?

Why doesn’t preloading Python before the extension work on Windows? I tried python3.lib, obviously.

I thought abi3 is supposed to make things work for the minimum version of Python 3.x and higher without any extra steps… Perhaps a more complete example would be useful? cryptography, for example, publishes abi3 wheels and they work just fine: cryptography · PyPI.
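At the C level, targeting abi3 is mostly a matter of pinning Py_LIMITED_API before including Python.h. A minimal sketch (the module name is made up; 0x03080000 targets the 3.8 stable ABI, so the same binary should load on 3.8 and newer):

```c
/* Hypothetical module "demo", built against the stable ABI. */
#define Py_LIMITED_API 0x03080000   /* oldest Python this wheel supports */
#include <Python.h>

static struct PyModuleDef demo_module = {
    PyModuleDef_HEAD_INIT,
    "demo",     /* module name */
    NULL,       /* no docstring */
    -1,         /* no per-interpreter state */
    NULL        /* no methods yet */
};

PyMODINIT_FUNC
PyInit_demo(void)
{
    return PyModule_Create(&demo_module);
}
```

On Windows such a build links against python3.lib rather than a versioned python31x.lib, which is exactly what makes the one-binary-many-versions behaviour legitimate there.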

For my PyCXX project I test both the unlimited and the limited API.

When I test the limited API I test each API level and run the code against all Python versions that support that API level.

If you are using the limited API then what you want works.

But as you say, there are cases where the limited API is not sufficient to support some use cases. In that case you are going to need to build one .pyd for each Python version you want to support.

I’m not completely sure what you are doing here, so this is a guess.
If you built against the unlimited API on a specific version of Python then you are locked into that version of Python’s exported symbols.
When you try to use another version of Python, some of the symbols you built against are either missing or have changed.
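As a concrete illustration (not necessarily a symbol involved in your case): PyUnicode_AsUnicode exists in 3.11 but was removed in Python 3.12, so a non-limited extension built on 3.11 that calls it cannot have its imports satisfied by a 3.12 DLL:

```c
#include <Python.h>

/* Compiles (with a deprecation warning) against 3.11 headers, and the
 * symbol resolves against python311.dll. Python 3.12 removed
 * PyUnicode_AsUnicode entirely, so the same compiled binary fails to
 * resolve against python312.dll even if nothing else changed. */
static void
show_wide(PyObject *s)
{
    wchar_t *w = PyUnicode_AsUnicode(s);  /* gone in 3.12 */
    (void)w;
}
```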

Our extension is compiled without Py_LIMITED_API on Windows for Python 3.11. It does use non-limited APIs. I have it linking against python3.lib. The process that loads the extension preloads python3.dll from the 3.12 directory before it loads the extension. The 3.11 directories are not accessible to the process. Unfortunately, the DLL trace shows that the 3.12 paths need to have python311.dll. If I make a copy of python312.dll and call it python311.dll, everything works.

This whole idea works on Linux, but not on Windows. You can’t just preload the right Python there.
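For reference, the Linux side of the trick is just dlopen with RTLD_GLOBAL before the extension is loaded. A minimal sketch (the library and extension paths are made up):

```c
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    /* Preload the interpreter so its symbols become globally visible. */
    void *py = dlopen("libpython3.12.so.1.0", RTLD_NOW | RTLD_GLOBAL);
    if (py == NULL) {
        fprintf(stderr, "preload failed: %s\n", dlerror());
        return 1;
    }

    /* The extension's undefined Python symbols now resolve against
     * whichever libpython was preloaded above, regardless of the
     * version it was originally built against. */
    void *ext = dlopen("./my_ext.so", RTLD_NOW);
    if (ext == NULL) {
        fprintf(stderr, "extension failed: %s\n", dlerror());
        return 1;
    }
    return 0;
}
```

On Windows there is no equivalent: the .pyd’s import table names python311.dll explicitly and the loader wants that exact file, which is presumably why the renamed-DLL hack appears to work.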

Not by design. It is an accident that it works, and you can expect things to break in unexpected ways.

There are C macros that generate code that differs between Python versions, because the generated code must match the data structures of the version it was compiled against.

If it was this easy then there would be no need for the limited API to exist.


Thank you Barry. I’ll see if Py_LIMITED_API works for us (we use PyTypeObject, which is opaque under the limited API, so I don’t know). Can you give an example of a macro that generates the wrong code going from, let’s say, 3.8 to 3.11?

Sorry, I do not know of a specific example off the top of my head.
I have been following Python development discussions for many, many years and this issue came up at one point in the past.
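The general mechanism is easy to sketch, though. Macros like PyList_GET_ITEM have historically expanded to direct struct-field reads, so the field offsets are frozen into your compiled binary:

```c
#include <Python.h>

/* Without Py_LIMITED_API, PyList_GET_ITEM is (roughly)
 *     (((PyListObject *)(op))->ob_item[(i)])
 * i.e. a read at a fixed offset inside PyListObject. That offset must
 * match the layout of the interpreter that eventually loads the .pyd. */
static PyObject *
first_item(PyObject *list)
{
    /* Inlined struct access; excluded from the limited API. */
    PyObject *fast = PyList_GET_ITEM(list, 0);
    (void)fast;

    /* Exported function; part of the limited API and immune to
     * layout changes between versions. */
    return PyList_GetItem(list, 0);
}
```

I can’t point at a version pair where PyListObject itself changed, but this is the class of code the limited API exists to rule out.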

Let me ask the question a different way. We define several PyTypeObjects in our extension and have a giant 30-line initializer where we fill out the tp_* entry points, only a couple of which we actually care about (from tp_name to tp_new, with tp_print being unset). Do you think that would ever work with Py_LIMITED_API? (Roughly the shape sketched below.)
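A hypothetical reconstruction with made-up names, to be concrete:

```c
#include <Python.h>

typedef struct {
    PyObject_HEAD
    int value;
} ThingObject;

/* Every designated initializer below is an offset into PyTypeObject,
 * whose full layout is only visible without Py_LIMITED_API. */
static PyTypeObject Thing_Type = {
    PyVarObject_HEAD_INIT(NULL, 0)
    .tp_name = "demo.Thing",
    .tp_basicsize = sizeof(ThingObject),
    .tp_flags = Py_TPFLAGS_DEFAULT,
    /* ... ~30 more tp_* assignments in the real code ... */
    .tp_new = PyType_GenericNew,
};
```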

I think defining types under the limited API uses a newer API (PyType_Spec and PyType_FromSpec); you would need to check the docs.
Also, you can define Py_LIMITED_API appropriately and try building your code. If you use an API that is not part of the limited API you will get compiler errors.
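If it helps, a sketch of that route for the hypothetical Thing type above, under the 3.11 stable ABI:

```c
#define Py_LIMITED_API 0x030B0000   /* 3.11 stable ABI and newer */
#include <Python.h>

typedef struct {
    PyObject_HEAD
    int value;
} ThingObject;

/* Slots replace the tp_* fields; a slot id that is not part of the
 * limited API simply does not exist, so misuse fails at build time. */
static PyType_Slot thing_slots[] = {
    {Py_tp_new, PyType_GenericNew},
    /* ...only the slots you actually use... */
    {0, NULL}
};

static PyType_Spec thing_spec = {
    .name = "demo.Thing",
    .basicsize = sizeof(ThingObject),
    .flags = Py_TPFLAGS_DEFAULT,
    .slots = thing_slots,
};

/* In the module init function:
 *     PyObject *t = PyType_FromSpec(&thing_spec); */
```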