Using ABI instead of release version in site-packages path

Related to the accelerated release cadence, one of the main pain points of new releases for extension developers is that release tooling can be cumbersome and slow to update to add entries for 3.9, 3.10, etc. This is mainly necessary because extensions must be recompiled to be compatible with the latest ABI. A corollary of the accelerated release cadence is that there could be more consecutive releases that are ABI-compatible, so the only real difference between installations is the path, which can be frustrating. If two Python versions are ABI-compatible, it would be nice if a wheel or installation could be considered fully compatible, so that a new wheel need not be built. Using the ABI version in the path instead of the Python minor version, e.g. lib/pythonabi36m/site-packages/pkg/, is one possible way to decouple compatibility from the increasingly rapidly changing version number.
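To make the difference concrete, here is a small sketch; the ABI-tag form constructed below is purely illustrative (echoing the lib/pythonabi36m example), not an existing convention:

```python
import sys
import sysconfig

# Today's scheme embeds the minor version, e.g. ".../lib/python3.9/site-packages",
# so every feature release changes the path even when the ABI does not change.
print(sysconfig.get_path("purelib"))

# A hypothetical ABI-keyed layout (illustrative only):
abiflags = getattr(sys, "abiflags", "")  # empty string on Windows
abi_tag = f"abi{sys.version_info.major}{sys.version_info.minor}{abiflags}"
print(f"lib/python{abi_tag}/site-packages/pkg/")
```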

Does this seem like it could be feasible?

Conda package maintainers of pure-Python packages feel this pain because they must publish a new build for each new Python even when no new release of the package has been made and nothing needs to be done for PyPI (conda’s noarch builds alleviate this for some packages, but certain features, such as platform- or version-dependent dependencies, cannot be used in noarch conda packages). In these cases the only difference between the conda packages is the version number in the site-packages path.

I’m interpreting the question as using ABI in the path by default, since you can easily do this on a per-environment (virtual or not) basis with symlinks.

The path schemes are hard-coded in Python, and packaging tools can’t change them without breaking Python. I guess it would be theoretically possible for CPython to change this (the scheme is implementation-dependent, so you’d need to propose changes to PyPy, Jython, etc. individually), but there are so many compatibility issues in practice that this can probably never happen.
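You can see the hard-coding directly: the unexpanded scheme template in sysconfig has the version baked in. Tools can read these templates, but changing them means patching the interpreter:

```python
import sysconfig

# With expand=False, sysconfig returns the raw scheme template. On POSIX this
# is something like '{base}/lib/python{py_version_short}/site-packages' -- the
# version placeholder is part of the scheme itself, not tool configuration.
print(sysconfig.get_path("purelib", expand=False))
```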

Is this about building packages or about where they get installed on the end-user’s system?

Packages with non-stable-ABI C extension modules must be recompiled to get the new Python version into the extension filename, even if they are ABI-compatible. There is no generic cp3 Python tag for C extension modules. Please correct me if I am wrong. So we cannot just point a new version of Python at the older version’s site-packages directory without installing a newer build of the package.
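The version tag in question is visible in the extension-module filename suffix that the interpreter reports:

```python
import sysconfig

# The suffix for compiled extension modules embeds the interpreter version and
# ABI tag, e.g. '.cpython-39-x86_64-linux-gnu.so' on Linux or
# '.cp39-win_amd64.pyd' on Windows, which is why a per-version rebuild (or at
# least a rename) is needed for non-stable-ABI extensions.
print(sysconfig.get_config_var("EXT_SUFFIX"))
```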

certain features, such as platform- or version-dependent dependencies, cannot be used in noarch conda packages

Are there platform dependencies other than c-extension modules that prevent packages from being noarch? If the package has version-dependent dependencies, upgrading the version of those dependencies will require a rebuild.
Or maybe I misunderstood something.

Nope, you’re right. See for the enhancement request to add such a tag.

I think the main problem is that this would confuse people’s understanding of where packages go when they get installed, though I certainly appreciate the intent. Perhaps conda needs its own relocation functionality so that pure-Python packages can build for lib/PYTHONTARGET/site-packages/pkg and resolve the variable at install time?

IIUC, pip-like tools assume the target will be under the lib path, and use the wheel’s .data items to put files elsewhere. So at least the majority of basic packages don’t even think about it.

But it’s also just a sys.path entry, so no reason why Python in a conda environment can’t use a different one. You’re in a “controlled” environment at that point, so it may be reasonable to remove some of the safety measures we need to use in less controlled ones. If changing that location isn’t just a patch to sysconfig (and distutils.sysconfig, for now), then it ought to be.
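A sketch of the “just a sys.path entry” point: a distribution could expose an extra, ABI-keyed directory as a normal site directory without touching the default scheme. The directory name used here is purely illustrative:

```python
import os
import site
import sys
import tempfile

# Create a throwaway directory standing in for a hypothetical ABI-keyed
# site-packages location, then register it as a site directory.
extra = os.path.join(tempfile.mkdtemp(), "abi3-site-packages")
os.makedirs(extra)
site.addsitedir(extra)  # appends to sys.path and processes any .pth files there

print(extra in sys.path)  # True
```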

(Aside: I think it’s very unlikely that we’ll get adjacent releases with fully compatible ABIs, but at least there’s renewed interest in making ABI3 viable again.)

FWIW, we nearly added it for Windows, and then dropped it. But Windows still supports untagged .pyd modules (and never risked mismatched Python versions anyway), you just need a custom build process to omit the tag.

A bit of both, but mostly about building.

This is what I’m trying to get at. If it’s literally only the version number that makes them incompatible, then it seems like maybe we should work on a way for that not to be the case. Thanks @brettcannon for the link!

Yes, a common example is depending on pywin32 on Windows only. This forces a package to adopt full per-arch, per-Python builds, even if the package is pure Python. Conda doesn’t have an equivalent of environment markers that is evaluated at installation time; instead, all conditional dependencies are evaluated at build time by templating done by conda-build. This is really a conda issue, though, and I should probably submit a separate feature request to conda to allow installation-time evaluation of conditional dependencies.

Maybe I should have stepped back a bit instead of proposing a specific idea about paths. My root goal is to understand whether there’s anything we can do to reduce the need to publish new builds of compiled extensions when the only relevant change between versions is the version number itself. What would need to change in Python (and/or pip) for me to build and upload an extension-containing wheel for 3.10 and have the same wheel still be installable and importable with 3.11, beyond the requirement that nobody propose ABI-breaking changes in 3.11? If backward-incompatible ABI changes are expected to come every year, this may not be worth your valuable time to think about! That would be surprising, though.

Pure-Python packages have been able to do this for some time with py3 wheels, but compiled packages can’t. Pure-Python (non-noarch) conda packages have the same issue: it’s literally only the pyX.Y in the paths that forces rebuilds, so that’s where I was coming from. That’s potentially a solvable problem purely on conda’s side, though.

I would probably not propose changing the default installation path; rather, the idea was to add a default ABI-based import path to sys.path so that a package system (such as conda) could opt in to ABI-based installation without needing to modify or similar. That was my (possibly super off base!) idea, at least.

I’m confused. There’s no reason for a pure-Python package to need a per-version build, even if it depends on a per-version package like pywin32. You can install the pure-Python package alongside whichever pywin32 binary matches the environment.

In wheel terms, you can install a py3 wheel of foo with a py37 wheel of pywin32 in your Python 3.7 environment, and that same py3 wheel of foo with a py38 wheel of pywin32 in your Python 3.8 environment. Is conda somehow different? If so, this sounds more like a conda issue than a packaging one.
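To make the compatibility story concrete, here is a toy parser for the tag triple at the end of a wheel filename (it assumes well-formed names; real tools should use the packaging library):

```python
def wheel_tags(filename):
    """Return the (python, abi, platform) tag triple of a wheel filename."""
    stem = filename[: -len(".whl")]
    py, abi, plat = stem.split("-")[-3:]
    return py, abi, plat

# A pure-Python wheel is installable on any Python 3:
print(wheel_tags("foo-1.0-py3-none-any.whl"))             # ('py3', 'none', 'any')
# A compiled wheel is pinned to one interpreter version and platform:
print(wheel_tags("pywin32-300-cp38-cp38-win_amd64.whl"))  # ('cp38', 'cp38', 'win_amd64')
```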

Is there a reason why you don’t want to use Py_LIMITED_API? It does almost what you want: it sets the supported version interval to [cpXY; abiX], though it regrettably limits the available API; see PySide2, for example. IMO the limited API is a small price to pay for such a feature.

Also, I don’t know about Unices, but on Windows the built library is linked against python{MAJOR}{MINOR}.dll, meaning that the name python{MAJOR}{MINOR}.dll is hardcoded into the binary file, making support of potential future CPython versions essentially impossible (about the only thing you could do is try to edit the built library in a hex editor). To bypass this, you define something like -DPy_LIMITED_API or -DPy_LIMITED_API=0x030500f0, which causes your library to be linked against python{MAJOR}.dll, shipped with Python since version 3.2 (which makes cp3 somewhat confusing). I.e., I don’t think a path-based solution will help.
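For readers puzzled by the 0x030500f0 literal: Py_LIMITED_API takes a PY_VERSION_HEX-style value, with one byte each for major, minor, and micro, then a nibble each for release level and serial. A small helper shows how 3.5.0 final encodes to exactly that value:

```python
def py_version_hex(major, minor, micro=0, level=0xF, serial=0):
    """Encode a version as PY_VERSION_HEX: 0xMMmmppRS, where R is the
    release level (0xF = final) and S is the release serial."""
    return (major << 24) | (minor << 16) | (micro << 8) | (level << 4) | serial

print(hex(py_version_hex(3, 5)))  # 0x30500f0, i.e. 0x030500f0 == 3.5.0 final
```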

As for conda… well, that’s their issue. Pure-Python wheels avoid this by simply specifying the minimum supported version, with the maximum version assumed to be unbounded (I think).

P.S. Sorry, wanted to reply to Min RK…

Yes, this part is absolutely about conda. I tried to describe it above, but conda doesn’t have runtime-evaluated conditional dependencies; all conditional dependencies are evaluated at build time by conda-build rather than by conda (the installer). This could be a feature request for the conda installer, though.

Py_LIMITED_API=0x030500f0 sounds like it could be exactly what I want, thank you! I will investigate whether it is suitable in my cases and what’s needed to use it. There does appear to be a lack of documentation on how to use this: setuptools’ support is undocumented other than a mention in the changelog that it was added, and I can’t find an example, tutorial, or recommendation about how and when it might be used. Maybe this turns out to be mostly a documentation request?
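For what it’s worth, the setuptools hook looks roughly like this (a minimal sketch; spam and spam.c are placeholder names, and exact behavior may vary across setuptools/wheel versions):

```python
from setuptools import Extension

# Declare a stable-ABI extension: the Py_LIMITED_API macro pins the oldest
# stable ABI the code targets (3.6 here), and py_limited_api=True tells the
# build machinery to tag the compiled module/wheel as abi3 instead of a
# single-version tag like cp36.
ext = Extension(
    "spam",
    sources=["spam.c"],
    define_macros=[("Py_LIMITED_API", "0x03060000")],
    py_limited_api=True,
)
print(ext.py_limited_api)  # True
```

This would then go into setup(ext_modules=[ext]) as usual; bdist_wheel also accepts a --py-limited-api cp36 flag to control the wheel tag.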

Again, the issue is really that lots of packages currently need to be rebuilt on each release of Python, and it seems there is more rebuilding going on than there needs to be. Publishing recommendations and documentation about how and when py_limited_api can be used, and where it fits into packaging best practices, might help a lot! It seems like py_limited_api should be used by default by most extensions, except where it’s not an option.

Well, packaging is definitely one of the weakest points of Python, and I don’t think it will improve to the point where it becomes easy. I had to write a custom PEP 517 build backend and use CMake to end up with a somewhat clean solution (ATM CMake doesn’t support Py_LIMITED_API, but it should be possible to modify properties of the Python::Module target to set the path to python{MAJOR}.dll/so/lib). IMO setuptools is too broken/weird/old to bother doing anything with it (it still generates eggs, copies files multiple times all over the place, possibly because of wheel, makes hacking/extending a major PITA, etc.).

I could probably write some basic instructions after finishing my own project, especially considering that building manylinux wheels with Docker doesn’t seem to be covered at all (I still have to do this myself, but I finally figured out how to approach the problem). When I’m finished, I’ll contact you by PM, but if it takes more than two months you can PM me instead, because there’s a chance I’ll have forgotten about all of this.