Towards standardizing cross compiling

Would this be as good a time as any to do this? This has been a big thorn in my side as well. At the very least it should be possible to have an Informational PEP that reserves namespaces for specific usages (a la tool tables in PEP 518).

I think so. The thing to check is whether there are any “official” settings already in use, or any common setuptools ones, particularly those that might be intended to bleed into any/all packages in an install, rather than just one.

It seems like a reasonable assumption that if values are being ignored by a backend, the user is going to be less than happy if it’s silent. Any other ways we can test this, though?

Strange. I could have sworn everyone called that a target. Hearing host, I could very easily read the opposite meaning into it: The host being that which hosts the compiler.


It comes from GNU autotools, which CPython uses. When you cross-compile CPython, you specify --build and --host, and the terms are used internally in a couple of places. target has a different meaning in GNU autotools too.
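For instance, cross-compiling CPython with autotools looks something like this (the triplets are illustrative; adjust for your toolchain):

```shell
# "build" = the machine doing the compiling; "host" = the machine the
# resulting binaries will run on. Triplets here are examples only.
./configure --build=x86_64-pc-linux-gnu --host=aarch64-unknown-linux-gnu
# --target only matters when building a compiler toolchain itself,
# which is why autotools gives it a separate meaning.
```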

I put that there just so we’d all be on the same page. I’ve found that when host/target terminology gets mixed with build/host terminology (or anything else), things get really confusing really fast. (You’re right that most people say “target.”)

I think the most realistic thing we can hope for is a standard sysconfig “dump” tool that can be run on the host platform to produce all the information needed to build on the build platform.
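As a minimal sketch of what such a dump might look like, using only the standard library (exactly which keys a real tool would need to emit is an open question):

```python
import json
import sys
import sysconfig

def dump_host_config():
    """Collect the sysconfig data a build machine would need in order to
    cross-compile for this (host) platform. This just grabs the obvious
    pieces; a real standard would have to pin down the required keys."""
    return {
        "platform": sysconfig.get_platform(),
        "version": list(sys.version_info[:3]),
        "paths": sysconfig.get_paths(),
        "config_vars": sysconfig.get_config_vars(),
    }

if __name__ == "__main__":
    # Run this on the host platform; ship the JSON to the build platform.
    print(json.dumps(dump_host_config(), indent=2, default=str))
```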

Bit late to respond to this; a standard tool to do this would be perfect for PyO3. In particular, an extension to PEP 517 only really helps with cross-compiling Python packages. If users are trying to cross-compile a Rust program with embedded Python, then PEP 517 is not relevant.

Reviving this thread: I have a draft PEP for some of the issues discussed here at Draft PEP9999: Standardized Config Settings for Cross-Compiling by benfogle · Pull Request #1 · benfogle/cross-compile-pep-draft · GitHub

Feedback is welcome, and let me know if this should be spun off into its own thread.

This combined with zig cc could make module cross-compilation relatively simple.

Unfortunately, the compiler isn’t the hard part. It doesn’t seem to be in this thread, but we definitely looked at Zig back when discussing this last time around. Communicating to all the build backends that they should cross-compile, selecting the machine they should be compiling for, and helping them find the headers, libs and options they need to build is the hard part. Those are all Python-specific.
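The only channel frontends have for this today is PEP 517’s config_settings dict, which is exactly the underspecified part. Here’s a toy sketch of how a backend hook might consume a namespaced cross-compile setting; the "cross_compile.*" keys and their shape are invented for illustration and aren’t part of any standard:

```python
# Toy model of a PEP 517 backend consuming hypothetical, namespaced
# cross-compile config settings. PEP 517 only says config_settings is a
# dict of string keys; everything beyond that is backend-defined today.

def select_build_plan(config_settings=None):
    """Return what to build for, based on (hypothetical) settings."""
    settings = config_settings or {}
    include_dirs = settings.get("cross_compile.include_dirs", "")
    return {
        # Fall back to a native build when no setting is passed.
        "platform": settings.get("cross_compile.platform", "native"),
        "include_dirs": include_dirs.split(":") if include_dirs else [],
    }

# A frontend cross-compiling for aarch64 Linux might pass:
plan = select_build_plan({
    "cross_compile.platform": "linux_aarch64",
    "cross_compile.include_dirs": "/opt/sysroot/usr/include",
})
```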

Sorry, I missed this when you posted it - I think I was already on holiday by then. I’ll take a look this week.


Somewhat of a tangent, but I explored using Zig to build extension modules a while ago:

Unfortunately I have not found a good use case for this. Hopefully someone will, if I manage to remind everyone about it often enough!

Not sure about a use case specifically for packages building themselves, but it does sound cool.

My use case for cross-compiling more generally is to be able to build Docker containers locally on macOS (or other non-Linux platforms) without a VM. We use Bazel to build software, including Docker images, and unlike standard Dockerfiles, which are basically scripts executed in a container (and on macOS, within a VM), Bazel creates the proper tarball structure from artifacts it has built previously.

Bazel has pretty good support for cross-compilation, and for languages like Go, or even C++ with bazel-zig-cc, the cross-compilation can “just work”: the Docker image produced on a Mac will execute on a Linux host. But for Python, Bazel will just naively dump packages with Mach-O binaries into the container, which obviously won’t work on Linux. Cross-compilation for Python packages would help with that.