Is there an alternative to wheels, to avoid a big matrix of binary wheels?

I’m trying to make a package take less time for users to install, but it seems like building PyPI-compatible wheels just results in a big matrix of versions and wheels. Is there a good way to avoid this with other tools, short of requiring users to compile it? Or (better) is there a way to avoid this problem with wheels themselves?

You can reduce the build matrix in a few ways.

  1. Use and target the stable ABI (see Platform compatibility tags — Python Packaging User Guide). If this is possible for your code, it removes the Python-version component from the matrix.

  2. Limit your use of native code to things that can be accessed via ctypes or cffi, build the shared library once per platform, and then include it in wheels with GitHub - jvolkman/repairwheel: Repair any wheel, anywhere
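To illustrate option 2, here is a minimal ctypes sketch. The bundled-library path and module name in the comment are hypothetical; to keep the snippet runnable as-is, it loads the system math library instead of a library shipped inside a wheel.

```python
import ctypes
import ctypes.util

# In a real package you'd load a library bundled next to your module, e.g.:
#   from pathlib import Path
#   _lib = ctypes.CDLL(str(Path(__file__).parent / "_native.so"))  # hypothetical name
# Here we load the system math library just to show the calling pattern.
_lib = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature so ctypes converts arguments correctly:
#   double cos(double)
_lib.cos.restype = ctypes.c_double
_lib.cos.argtypes = [ctypes.c_double]

print(_lib.cos(0.0))  # 1.0
```

Because the Python-facing code only depends on the shared library's C ABI, the same wheel works across Python versions on a given platform.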

These are the two I’ve used before; others may have other options.


Thanks, I’ll take a look at both of these and if neither works, I’ll ask with more detail based on exploring.

I believe you can also use cibuildwheel to automate building the various wheels, if the amount of effort needed to create them is the issue.
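For reference, cibuildwheel is configured in pyproject.toml; a minimal sketch might look like this (the selectors and the package name in the test command are illustrative, not taken from this thread):

```toml
[tool.cibuildwheel]
# Build CPython wheels only, skipping PyPy and musl targets (illustrative).
build = "cp3*-*"
skip = "pp* *musllinux*"

# Smoke-test each built wheel in an isolated environment.
test-command = "python -c \"import mypackage\""  # 'mypackage' is a placeholder
```

This doesn't shrink the matrix, but it makes producing it nearly hands-off in CI.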


Hi Liz,

Look into automation, and sdists if your package is pure Python (no C, Rust, or other compiled extensions).

Firstly, there are lots of GitHub Actions, and other CI tools from other platforms, that largely automate the process of publishing to PyPI. You can even store per-library PyPI secrets as per-repo secrets, and so never use your PyPI account’s master key. This is the official one for PyPI: GitHub - pypa/gh-action-pypi-publish: The blessed GitHub Action, for publishing your distribution files to PyPI

When you’re starting out and figuring out how these work, for side-effect-free experimentation, you can set a GitHub Actions workflow to `on: workflow_dispatch` so it requires you to click a button instead. That way you don’t have to worry about commit tags and PRs from `on: push`.
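As a sketch, a manually triggered publish workflow using the official action might look like this (the file name, Python version, and build steps are assumptions; only the `pypa/gh-action-pypi-publish` action comes from the link above):

```yaml
# .github/workflows/publish.yml (hypothetical path)
name: Publish to PyPI

on:
  workflow_dispatch:   # runs only when you press the button in the Actions tab

jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      id-token: write  # needed for PyPI trusted publishing
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: python -m pip install build && python -m build
      - uses: pypa/gh-action-pypi-publish@release/v1
```

Once you’re comfortable with it, you can switch the trigger to tags or releases.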

Is this a C extension, or are there a lot of bundled dependencies? If it’s pure Python, creating source dists is easy: just use hatchling and follow the tutorial. It’s probably not recommended for various reasons, but it’s clean and straightforward. I really like it. You’d then only have a list (a single new file for each version you release), not a matrix, to manage.
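A minimal hatchling setup is just a pyproject.toml like the following (the name and version are placeholders):

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "mypackage"   # placeholder
version = "0.1.0"
```

Then running `python -m build` (from the `build` package) produces the sdist and wheel in `dist/`.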

I’m not sure that will reduce install time for your users, unfortunately, but it will at least make your own life easier.

If it’s pure Python, you only need one wheel, not one per platform and Python version. An sdist + a wheel is what the tools will generate by default. You should upload both of them to PyPI.


scikit-build-core has Python Stable ABI support, which can take the “version” factor out of the matrix.
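For reference, the stable-ABI setting in scikit-build-core is a one-line pyproject.toml option; a sketch, where the specific minimum version tag is an assumption:

```toml
[tool.scikit-build]
# Build a single abi3 wheel usable on CPython 3.8 and newer (tag is illustrative).
wheel.py-api = "cp38"
```

With this, one wheel per platform covers all supported Python versions instead of one wheel per platform-version pair.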

This worked well and allowed me to cross compile using zig, further reducing the amount of CI runners I needed.

Right now, the headers needed to build a native Python module use a lot of macros that don’t work well for cross-compiling. Some of that seems to be on the zig cc side, but reading some of the macros, it looks like some are invoked in a way that causes issues with side-effect duplication. I’ll try to raise the issue with zig if I track down why this is failing later.

Thanks, I’ll take a look at this later; I’d like to compare it to what I got working today. I can imagine needing something in the future that may not cross the cffi boundary as easily.

For anyone finding this later: the information provided was useful, but I left out that it led to me discovering a few more necessary steps. The RPATH in the wheel needed to be set as well. This is handled automatically by the mature wheel tooling that exists, but I had a reason not to use that.