Consider downgrading Windows 32-bit from Tier-1 to Tier-2 or Tier-3? (in Python 3.13?)

OK, but I’m talking specifically about my experience in consultancies for which individual consultants can use “whatever helps to get things done” but corporate support is only for specific “core products” - typically big ERP systems and similar. In those environments, core IT policy can be fairly weird (and yes, it’s often dysfunctional). I’ve never seen a case where regular installers were blocked (and I can’t imagine such a policy working in that environment). Blocking the Windows Store is typically part of limiting the base Windows OS build, but consultants generally have broad capabilities over their own PCs - so installers work fine, as long as you can download them via the (centrally configured) browser. Almost no “development tools” are provided centrally - even getting a text editor other than Notepad can involve finding and installing your own :slightly_frowning_face:

These sorts of environment are a major example of where Python (as an automation and data analysis tool) can provide significant added value (I know, I’ve given internal presentations with precisely this message) while being almost completely unsupported by the IT policy.

The target audience here is consultants who will simply ignore Python if it’s hard to set up. They’ll use R, or even Excel, for data analysis, and shell scripts or Perl for automation. Instead of the message being “Python is a great tool” it will be “data analysis is great, here’s R Studio or Excel” or “devops is the new big thing, and it’s about writing shell scripts”.

Personally, I think that getting Python into the hands of users like this is essential, and having a simple and effective “download and install” way that people can just get on with using Python is key to that. And yes, I imagine you’ll respond with “use conda for that”. I tried. People weren’t interested, because too much example Python material on the web assumes pip/virtualenv/PyPI, and my target audience can’t translate that into conda terms without help they don’t have the means to get. Also, do we really want to go back to a situation where Windows users feel like second-class citizens, because all of the instructions written for and by Unix users that discuss virtualenv, pip and PyPI “don’t cover Windows, which uses conda”?

(Evangelism mode off :slightly_smiling_face:)

I agree we already have too many options. But otherwise, that does indeed sound like a good solution. Unfortunately, I no longer work at my previous employer, so I can’t experiment to see whether MSIX installers are blocked along with the Store, but I suspect not, based on experience.

I agree with Steve that this is a process issue within the company, but it’s something I’m very familiar with, both personally and anecdotally from talking to others. It usually goes something like this:

  1. The central IT department is given a requirement to lock everything down
  2. Central IT block as much as they can until they are told it’s an unworkable environment, e.g. the Windows Store, many other Windows features, allow-listed Internet access, running arbitrary executables, etc.
  3. The majority of users can manage with this, but those requiring development tools cannot
  4. No policy is in place to handle this set of users, so instead they get exceptions to install their own tools
  5. These users are left unable to access useful Windows dev features like WSL, the Windows Store, etc., but they can manually download and install Python from python.org
3 Likes

At a previous company, 32-bit Python was popular (and the main version in use). They would basically copy-paste the Python directory from host to host (a practice that started many years ago).

Certain teams would write tools and say they had only validated that Python directory; others would close bugs as won’t-fix if another Python (or even an x64 Python of the same version) was used.

I could picture that same group of people refusing to ‘validate’ an x64 Python today and hanging on to 32-bit, even if they decided to update the golden folder to ‘minimize the impact of change’.
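
To give a concrete (and entirely hypothetical – the version number and wording are made up) picture of what that ‘validation’ mindset looks like in code, a guard like this at the top of a tool is roughly how such teams pin themselves to the blessed 32-bit interpreter:

```python
import struct
import sys

# Hypothetical guard a team like that might put at the top of its tools:
# refuse to run on anything other than the "validated" 32-bit interpreter.
EXPECTED_BITS = 32          # the blessed build was 32-bit
EXPECTED_VERSION = (3, 7)   # made-up "golden" version, for illustration only

bits = struct.calcsize("P") * 8   # pointer size in bits: 32 or 64
if bits != EXPECTED_BITS or sys.version_info[:2] != EXPECTED_VERSION:
    sys.exit(
        "Unvalidated interpreter: Python %d.%d (%d-bit); "
        "only %d.%d (%d-bit) is supported."
        % (sys.version_info[0], sys.version_info[1], bits,
           EXPECTED_VERSION[0], EXPECTED_VERSION[1], EXPECTED_BITS)
    )
```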

I would imagine plenty of other enterprise environments have similar experiences.

I have encountered some secondary schools that had to use older versions of Windows (such as Windows 7 or even XP) due to outdated hardware. In such cases, their administrators tended to opt for a completely 32-bit environment, including CPython, to reduce memory usage so that the system could just barely meet the teaching requirements. They also preferred older Python versions such as 3.6 or 3.7.

The important point these examples illustrate is that older environments are quite common, and the software running in these outdated, restricted environments tends to be equally outdated.

As for CPython, version 3.11 is already very new, and users who can use this version rarely lack a 64-bit environment. In most cases, it’s just a coincidence that they are using 32-bit CPython. I believe these situations should not hinder the rapid iteration process of CPython.

1 Like

Here’s a PR to make macOS (Intel) required: python/cpython#110362. I haven’t made Windows (x86) required yet; some flaky tests need sorting out first.

Please wait 2 to 3 weeks; I’m actively working on fixing tons of bugs which seem to be more likely on, if not specific to, Windows. I changed the CI recently to make it stricter (any failure now marks the whole build as failed, even if the test passes when re-run), and unstable tests are a pain. Over the last few days, the number of test failures “specific to Windows” has decreased. Now most platforms seem to be about equal in terms of failure rate :slight_smile:

1 Like

Please see: test_signal: test_stress_modifying_handlers() crash with SIGSEGV on GHA macOS (macOS-12.7) · Issue #110083 · python/cpython · GitHub

FYI I fixed the test on Windows x86: [3.11] gh-108851: Fix tomllib recursion tests (#108853) by vstinner · Pull Request #109013 · python/cpython · GitHub

Am I correct in understanding that we are choosing to keep Windows 32-bit as first-tier?

Yes that is correct.

1 Like

Darn :sweat_smile:

Okay then, I guess I’ll have to start building wheels and installers for that platform again

Windows 10 will reach end of support on October 14, 2025.

that’s the last 32-bit version there is.

2 Likes

Until Windows drops support for running 32-bit binaries (or adds support for loading 64-bit DLLs from 32-bit processes, which is very unlikely), we don’t consider it dropped. A 32-bit-only OS isn’t the requirement.

5 Likes

I’m not sure I agree with the above (ofc I’m not a core dev).

I accept the argument that someone could keep publishing their own 32-bit binaries that link to the CPython library. And they can continue for a long time; I don’t think there’s ever been a plan or announcement from Microsoft to even consider dropping 32-bit apps, unlike e.g. Apple, which switched to 64-bit only circa 2019.

At the same time, I’d imagine that Python is used as a binary a whole lot more than it is embedded in other binaries, and I’d argue that should be the priority. Third-party library users have a way forward: they could ship 64-bit binaries, and I think they ought to recompile their binaries whenever there’s a header-level or API-level change in CPython.

In other words, the argument about Windows support for running 32-bit binaries seems to boil down to the use case where there’s a third-party 32-bit binary dynamically linking a 32-bit Python library, and the intention is to swap in a newer Python library without the binary being recompiled.

That seems like a very narrow use-case.

1 Like

You can dispute the use cases, but please provide some actual benefits of dropping support, given that we can’t drop support for 32-bit entirely.

All the proposed benefits, other than being allowed to merge bugs, have been disproven throughout the thread.

2 Likes

Honestly, it also pains me that the _freeze_module project, when using the default configuration from build.bat, seems to build for 32-bit unconditionally. I want it to build only for the arch that everything else is building for. That way, for ARM, ARM64, or even AMD64 builds there would be no 32-bit win32 folder, and I could easily detect from msbuild which arch of Python was built (win32, amd64, arm, or even arm64) when my custom embedded interpreter builds. That would let me copy the files accurately depending on which arch I’m building my embedded interpreter for. Currently the only accurate way is to:

  • manually add the pythoncore project’s files into my embed exe project file.
  • Figure out a way to manually copy ALL OF THE exports from the normal pythonXY.dll into a dummy stub DLL that redirects to that exe (to allow loading C extensions from the user site-packages folder without failure).
    • I tried using a Python script similar to Tools\build\stable_abi.py, but one that scans every header except pyport.h for the data and ABI macros to pick out those two kinds of exports (see the sketch at the end of this post). It mostly works, except when the regex gets tripped up by the following (fixing these would make my use case work 100% of the time):
      • a line break immediately after the macro
      • the _Py_NO_RETURN macro appearing after that macro (honestly, why not put it right before the macro?)
    • An alternative to the above would be a TOML file listing ALL of the exported symbols from pythonXY.dll, which one could use to generate their own pythonXY.dll stub for embedded interpreters that manually or statically link the core itself into the exe (just to let them use C extensions without issues finding exports or whatever).

For my use case, this has become an unbelievable pain.
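
To make the header-scanning idea concrete, here’s a rough sketch of the kind of script I mean. It is not Tools\build\stable_abi.py; the Include directory path and the plain-text output are just assumptions for illustration. It pulls symbol names out of PyAPI_FUNC/PyAPI_DATA declarations while tolerating a line break right after the macro and an intervening _Py_NO_RETURN:

```python
"""Rough sketch only: collect symbol names from PyAPI_FUNC/PyAPI_DATA
declarations in the CPython headers, so something like a stub pythonXY.dll
(or a .def file) could be generated from them."""
import pathlib
import re

# Match "PyAPI_FUNC(<type>) [_Py_NO_RETURN] name" and "PyAPI_DATA(<type>) name".
# The \s* spans newlines, so a line break right after the macro is fine.
DECL = re.compile(
    r"PyAPI_(?:FUNC|DATA)\s*\([^)]*\)\s*"   # the macro and the declared type
    r"(?:_Py_NO_RETURN\s+)?"                # optional noreturn marker after the macro
    r"(?P<name>[A-Za-z_][A-Za-z0-9_]*)"     # the exported symbol name
)

def exported_symbols(include_dir):
    """Return the set of symbol names declared under include_dir."""
    names = set()
    for header in include_dir.rglob("*.h"):
        if header.name == "pyport.h":       # pyport.h defines the macros themselves
            continue
        text = header.read_text(encoding="utf-8", errors="replace")
        names.update(m.group("name") for m in DECL.finditer(text))
    return names

if __name__ == "__main__":
    # Hypothetical usage: point this at a CPython checkout's Include/ directory.
    for name in sorted(exported_symbols(pathlib.Path("Include"))):
        print(name)
```

The output could then feed whatever generates the exports for the stub pythonXY.dll.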

It follows PreferredToolArchitecture, which is also used to select the compilers and other “native” tools required to build. Set this (env var or MSBuild property) to x64 or ARM64 and it’ll use the tools for that architecture instead.

(You can find this by reading PCbuild\pcbuild.proj, which is the top-level project used to build when you’re not running from within VS.)

1 Like

Personally, I think it should instead follow how all the other projects select the arch by default, so that someone who runs build.bat right after cloning the source gets everything in the amd64 folder (with win32 never created if amd64 is the main arch on their PC); that would make my issues much less of an issue. Also, it’s easy to forget to set environment variables, and I honestly prefer as few of them as possible for that exact reason (sometimes they can make builds non-reproducible).

This prevents cross-compilation. You can compile for ARM64 from an x64 machine because the tools architecture isn’t the same as the target one.

I wouldn’t be opposed to having the non-releasing tools (just _freeze_module right now) build into their own directory, but I won’t accept having them use the target architecture rather than the current one.

1 Like