IIUC, I think by 32-bit, @mattip is referring to x86 in particular. @steve.dower has provided the answer for Windows, and macOS doesn’t support x86 anymore, so I just want to chime in that most distros supporting x86 (and other obscure architectures) have their own CI systems for downstream testing and building.
Ideally, it would be great to warn any embedders who are including them in 32-bit apps, though I know those are hard to find. Chances are they won’t easily be able to just switch their app to 64-bit, and if they’re dependent on a numpy stack it could come as a surprise. (That said, most embedders seem happy enough to be a few versions behind anyway to keep the stability. So likely not going to have any painful blowback if they find out with everyone else.)
Have you checked the download stats for the existing 32-bit wheels?
There’s also the fact that the default Windows download has been 32-bit for a long time, so the number of Windows users with 32-bit Python will be artificially inflated. 3.9 is 64-bit by default, so things are easier there. But for 3.8 and earlier, dropping 32-bit builds of new project releases would impact all of those 32-bit users, even though they might be perfectly able to run 64-bit Python.
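One way to tell whether such a user could simply switch: a 32-bit process on 64-bit Windows runs under WoW64, which is detectable from the environment. A minimal sketch (the environment-variable check is Windows-specific; `python_bits` works anywhere):

```python
import os
import struct

def python_bits() -> int:
    """Pointer width of the running interpreter: 32 or 64."""
    return struct.calcsize("P") * 8

def on_64bit_windows() -> bool:
    """Best-effort check for 64-bit Windows, even from a 32-bit process.

    A 32-bit process on 64-bit Windows runs under WoW64, where
    PROCESSOR_ARCHITEW6432 holds the real architecture; a native
    64-bit process sees PROCESSOR_ARCHITECTURE=AMD64 (or ARM64).
    """
    native = os.environ.get("PROCESSOR_ARCHITEW6432",
                            os.environ.get("PROCESSOR_ARCHITECTURE", ""))
    return native in ("AMD64", "ARM64")

if python_bits() == 32 and on_64bit_windows():
    print("32-bit Python on a 64-bit OS: a 64-bit build would work here")
```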
That will hopefully change soon:
Even so, plenty of people want Python for truly native things, and we want to offer it.
Though it would certainly be easier to tell people to use x64 if they want a Python data stack, and use ARM64 if they want a native Python runtime.
Reviving this topic, as building NumPy/SciPy wheels for 32-bit Windows is getting more painful. I see that windows-arm64 can now use x86_64 emulation. Is there a way to check how “popular” 32-bit Windows is across the Python ecosystem?
SciPy dropped 32-bit Windows support in 1.9.2, released in October 2022; 1.9.1 was the last release to have 32-bit Windows wheels. SciPy received few complaints, and I’m not aware of many complaints when Anaconda dropped 32-bit support either. So I think it’s safe to say that it is very niche by now.
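When a project drops 32-bit support like this, one common pattern is to fail fast with a clear message rather than let users hit an obscure compile error. A hypothetical sketch (not what SciPy actually does):

```python
import struct

def require_64bit(package: str) -> None:
    """Raise a clear error on 32-bit interpreters.

    Hypothetical guard a project might add to its build script after
    dropping 32-bit support, so users see an explicit message instead
    of a confusing compile failure.
    """
    if struct.calcsize("P") * 8 < 64:
        raise RuntimeError(
            f"{package} no longer supports 32-bit Python builds; "
            "please use a 64-bit interpreter."
        )

require_64bit("example-project")  # no-op on a 64-bit interpreter
```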
In Fedora, we build Python packages with extension modules on all supported architectures. This includes i686, which is not available as an installable Fedora Linux, but exists to support the infamous “multilib” scenario (so users can install glibc.i686 and friends on an x86_64 system to run e.g. Steam, Wine, or some legacy app that is shipped as an ix86 ELF).
While I don’t expect users need multilib SciPy or NumPy, the problem arises with (very deep transitive) build dependency chains. E.g. there is this dependency chain:
- (everything) needs rpm, which needs
- rust-rpm-sequoia, which build-requires
- rust-packaging, which is written in Python and build-requires
- pytest, which build-requires
- hypothesis, which build-requires
- pandas, which build-requires
- scipy …
(There are more such chains.)
While most of the chains are usually “noarch” (i.e. the packages do not need to be built separately on each architecture, e.g. they are pure Python packages), such packages can no longer be built the same way on i686 – and our assumption is that “noarch” packages can be built anywhere.
I suppose that, given this trend, we will need to figure out how to deal with this problem on our side; I am just mentioning it for completeness. We don’t require wheels built for 32-bit, but we need the software to be buildable and testable on i686 (at least for now).
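For completeness, once the dependency chains are untangled, the usual RPM-level escape hatch is an `ExcludeArch` line in the spec file; a hypothetical fragment:

```spec
# Hypothetical: opt a package out of 32-bit x86 in its RPM spec.
# %{ix86} expands to the i386/i486/i586/i686 family.
ExcludeArch: %{ix86}
```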
I started the discussion in Fedora as well: How to drop 32bit support from the scientific Python stack - devel - Fedora Mailing-Lists ↩︎
Thanks for the context @hroncok. I agree that one needs to be able to build from source on i686. There’s also armv7, which is still quite relevant.
I suspect that this issue is mostly meant for dropping 32-bit Windows wheels, rather than anything else. @mattip if you agree, perhaps you can update the title of this thread?
Finally, I’ll also note that 32-bit Windows packages can be built from source just fine. The key problems are (a) there is no suitable compiler toolchain for SciPy and other Fortran-dependent packages, and (b) 32-bit Windows is by now so niche that it’s no longer worth supporting through wheels for any project where that takes significant effort. To me that is a separate question from whether or not from-source builds work: a broken from-source build should be considered a bug, just as for any other niche platform that we don’t have wheels for.
Using pypinfo to get download numbers for NumPy files on PyPI over the last 7 days (max 1,000 files), and then filtering for Windows for the last release:
```shell
pypinfo --days 7 --limit 1000 numpy version file system > 1000.md
rg Windows 1000.md | rg 1.24.3
```
That’s 98.0% for 64-bit Windows and 1.6% for 32-bit Windows (ignoring the sdist and manylinux).
(Full output of pypinfo: Output of `pypinfo --days 7 --limit 1000 numpy version file system` · GitHub)
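Once the per-file counts are in hand, the same breakdown can be computed with a few lines of Python. The numbers below are invented for illustration (chosen to land near the 1.6% figure above):

```python
# Sketch: given per-file download counts (as pypinfo reports them),
# compute the 32-bit share among Windows wheels.

def windows_32bit_share(downloads: dict[str, int]) -> float:
    """Fraction of Windows wheel downloads that are win32 builds."""
    win = {f: n for f, n in downloads.items()
           if "win32" in f or "win_amd64" in f}
    total = sum(win.values())
    win32 = sum(n for f, n in win.items() if "win32" in f)
    return win32 / total if total else 0.0

sample = {  # hypothetical counts, not real pypinfo data
    "numpy-1.24.3-cp311-cp311-win_amd64.whl": 980_000,
    "numpy-1.24.3-cp311-cp311-win32.whl": 16_000,
    "numpy-1.24.3.tar.gz": 40_000,  # sdist, ignored
}
print(f"{windows_32bit_share(sample):.1%}")  # → 1.6%
```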
Looks like things will get even trickier in the future: we need the old rtools 4.0 installed in the workflow script, but 4.3 is now installed by default, which triggers:
> Installing the same package with multiple versions is deprecated and will be removed in v2.0.0.
Debian’s situation is similar to Fedora’s in that we try to build all our packages on all our supported architectures. That currently includes four 32-bit architectures:
- i386 (32-bit x86),
- armel (ARM EABI 5T+),
- armhf (ARMv7 with hard float), and
- mipsel (32-bit little-endian MIPS).
Unlike Fedora (if I understand correctly), arch-all packages (the equivalent of Fedora’s noarch packages) are built in an amd64 environment by default. So dropping 32-bit support for a package doesn’t have as much distro-wide fallout.
It does get annoying when low-level tools that are widely used drop support for an architecture, and take down entire trees of packages with them. But that mostly only affects the users of the obscure architectures. And the package maintainers who have to look at failing builds/tests. In some cases we can just ignore an upstream’s desire not to support an architecture, and continue to build a tool (maybe without tests). That can be good enough, or a cause of even more headaches…
Thanks for all the replies. I would summarize that it is OK if NumPy (and the scientific Python stack) stops testing 32-bit Windows and stops providing 32-bit Windows wheels. We (NumPy) will continue to test (but not provide wheels for) 32-bit Linux.
FWIW, for pyca/cryptography we had the opposite experience: we stopped testing on 32-bit Linux and stopped building 32-bit Linux wheels several years ago, and that caused no problems.
However, of our Windows wheels, nearly 15% are still 32-bit, so we haven’t dropped support yet. (Though I would really like to.)
Maybe the download statistics reflect CI workflows that pick up cryptography as a transitive dependency and do not really care whether they are getting the latest version.
It’s possible! One other factor is that for many years, python.org defaulted to 32-bit for downloading Windows installers, so many people were using a 32-bit binary despite having a 64-bit CPU and operating system.
Sharing some numbers for Pillow (from python-pillow/Pillow#6941 (comment)):
For Linux, the *i686.whl numbers are very low for all versions (0.01%–0.09%). The *win32.whl numbers are a bigger proportion, but still low (0.90%–5.63%).
In next month’s Pillow 10.0 release, we’re going to skip 32-bit wheels for Python 3.12, and I think it might be a good time to drop 32-bit wheels and testing for all versions.
Update: in Pillow 10.0 (2023-07-01) we decided to stop providing 32-bit wheels for all platforms.
But we kept compatibility and are still testing 32-bit for the time being (Debian, Windows). We had a couple of issues opened, but no real pushback.
For h5py, we dropped 32-bit wheels on all platforms with h5py 3.0 in 2020; I think we also had one or two issues about it, but no serious push to restore them. There was more interest in adding ARM-64 wheels (which has now happened).
h5py is probably in a particularly favourable niche for this, though - a lot of the scenarios where people use HDF5 go along with powerful computers with plenty of RAM, which pushed people towards 64-bit long before we dropped 32-bit.