Python is also not standardized, and yet I don’t think any of us believe it follows that it’s premature to use it. Can you expand a bit more on why you think standardization should play into this?
That’s a good question!
But I guess the real question here is: what problem does standardization actually solve?
Android and Linux seem to be doing just fine with Rust, even though it doesn’t have a formal standard.
Here’s an article from Mara (a member of the Rust Leadership Council) that discusses this topic: https://blog.m-ou.se/rust-standard/.
The distinction is that those rely on I-unsound-tagged bugs in the compiler (and no comparably advanced optimizing compiler is completely free of them), while the underlying formal proof assumes all compiler bugs are fixed… similar to how you shouldn’t fault a language for the underlying DRAM being susceptible to Rowhammer.
I don’t have the URLs on hand, but, if I remember correctly, GCC and LLVM have equivalent tags in their bug trackers.
In this context, the reason some of those bugs are long-lived is twofold:
- The developers have determined that they’re very difficult to encounter accidentally.
- Neither the rustc devs nor the LLVM devs nor the GCC devs nor any developers of optimizing compilers are willing to take on “this transformation is a security boundary, fit for processing malicious inputs”-level responsibility.
Rust has multiple mechanisms for building without access to Crates.io, depending on the specific circumstances. For example:
- `cargo fetch` and `cargo build --offline` can be used to separate the downloading and building while otherwise using Cargo the same way (a short sketch follows this list).
- `cargo vendor` can be used to vendor the dependencies without losing the information that something like cargo-audit would need.
- The Overriding Dependencies section of the Cargo Book covers things like overriding the Crates.io repository URL to locally point a package at a different source.
- While I haven’t kept up on the state of the art, it’s possible to run a local mirror of Crates.io more broadly using tools like Panamax.
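To make the first two concrete, here’s a minimal sketch of the workflows. The crate name and paths are hypothetical; the commands themselves are standard Cargo:

```sh
# Sketch: building without network access, assuming a hypothetical crate
# at ./mycrate whose dependencies come from Crates.io.
cd mycrate

# Option 1: pre-fetch, then build offline.
cargo fetch                  # downloads all dependencies into Cargo's cache
cargo build --offline        # later builds never touch the network

# Option 2: vendor the dependency sources into the repository itself.
cargo vendor vendor/         # copies dependency sources into ./vendor
# `cargo vendor` prints a [source] snippet to append to .cargo/config.toml
# so that subsequent builds read from ./vendor instead of Crates.io.
```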
I think that covers all the major tiers of the problem.
I think in terms of a tech stack, as you go further down you want to be increasingly conservative and prevent any breaking changes. This has been the success of Windows, which for all its faults is quite backward compatible: I can take code from 30 years ago and it will run. Similarly for the Web: I can look at archived pages from decades ago and they will still display.
Python is the foundation for many projects and businesses. I trust that code I write today will run in 1, 2, or 5 years. 10 years? Less trust, given the lessons learned from the 2to3 transition.
In turn, C is the foundation of Python. I trust that any changes to C will not impact Python and my investment of time, etc. won’t be at risk.
The core issue is trust. There is “currency” in trust. Python has a healthy bank account of trust, and I fear it will be at risk.
Generally speaking, standardization tends to come into play for one of two purposes:
- Re-unifying disparate implementations of a language (C, C++, ECMAScript, etc.)
- Making a proprietary product look more appealing to enterprise or government decision-makers (Java, .NET, Office Open XML, etc.)
Given that Rust’s regression suite and v1.0 stability promise already pin the language down more thoroughly than C or C++ and that gccrs plans to follow rustc as the source of truth, I’m not sure a standard would have much benefit here.
(Seriously. Look into how much about C is left implementation-defined. We generally greatly overestimate what the spec actually calls for. That’s one reason you tend to see big projects picking one or maybe two compilers per platform and coding against those. For example, the Linux kernel is written in GNU C and the ability to compile it using llvm-clang was a little bit about retiring use of features the kernel devs had decided were mistakes and overwhelmingly about teaching llvm-clang to support GNU C. …it also has its own non-standard memory model that only works because GCC is careful not to break it.)
See my other answer, regarding trust. Having a standard provides some measure of trust in a technology that is a foundation of a project.
I know nothing about Android, but I read that Ubuntu had a problematic release when it adopted the Rust-based uutils coreutils in 25.10. Those tools are a foundation of a Linux system. This undermines trust in Ubuntu. I’d hate for the same to happen to Python.
I don’t know that there is a level of standardization or certification that can satisfy every vague concern.
Personally, my trust in Python was broken as soon as I saw lines in the standard library docs saying things like *Deprecated since version 3.6, removed in version 3.12*. (Specifically the latter half.)
It’s one of the things that made me feel relieved that I’d decided to work on a Rust rewrite for any of my code that doesn’t need memory-safe QWidget bindings, Django’s ecosystem, or Django ORM/Alembic draft-migration autogeneration, the goal being to minimize the tension between “It works. Don’t **** with it” and “Burned myself out again trying to reinvent a stronger type system in my test suite”.
Love this effort, a strong +1 from me. I’ve been historically wary of bigger CPython contributions because I don’t know C, and don’t particularly want to know it. Rust is a completely different matter.
A lot of the introduction here is aimed at Rust as a replacement for the C parts of CPython. But I think Rust could be a huge win for optimizing Python parts of CPython. “Rewrite module X in C” usually means a significant effort, both up front and on-going, maintenance-wise. Rewriting in Rust could be a completely different story, if we do this right.
Maybe a standardisation to the effect that code from The Rust Book, 1st or 2nd edition, still works with the latest Rust. (Perhaps it does.)
It should. See Stability as a Deliverable for a description of Rust’s “v1.0 Stability Promise”.
Basically, so long as you’re not depending on a compiler bug or security hole, the only thing which should be allowed to break vN code in any later vN+M version of the Rust compiler is the occasional change to how type inference works in edge cases.
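To illustrate what that promise means in practice: the only sanctioned channel for syntax-level breakage is the opt-in edition system, so old code keeps compiling under its original edition. A small sketch (my own example, not from the book):

```rust
// 2015-edition code in which `async` is an ordinary identifier.
// The 2018 edition reserved `async` as a keyword, but that break is
// opt-in: a crate whose Cargo.toml says `edition = "2015"` still
// compiles this unchanged on the latest stable rustc.
fn async(x: u32) -> u32 {
    x + 1
}

fn main() {
    println!("{}", async(41)); // prints 42
}
```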
…also, I almost forgot to mention this:
Ubuntu’s troubles with uutils are, in my opinion and in the opinion of others, self-inflicted. The uutils devs are quite up-front that they haven’t yet achieved their goal of passing all the tests in the GNU Coreutils test suite, so Ubuntu trying to use them is similar to all the distros that made a mess by ignoring KDE’s announcement that 4.0 was meant to be a developer preview.
How should the Python community quantify this trust, given that your original metric (standardization) doesn’t apply to Python itself?
Conversely: do you moderate your trust in CPython based on the presence of unstandardized, compiler-specific extensions? The last time I checked, there were a nontrivial number of GCC extensions and attributes in the codebase (other compilers go to great efforts to be compatible with these, but they’re not standard).
> Basically, so long as you’re not depending on a compiler bug or security hole, the only thing which should be allowed to break vN code in any later vN+M version of the Rust compiler is the occasional change to how type inference works in edge cases.
Use of unstable features in crates is more common than I’d like it to be still, and the promise does not apply to that. CPython should avoid any use of them.
Rust for Linux currently relies on some such features, though they’re making an effort to stabilise the ones they’re relying on and not introduce more.
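For anyone unfamiliar with what “unstable features” look like: they’re gated behind explicit attributes that a stable toolchain rejects outright, which is how the promise’s boundary is enforced. A minimal sketch using one such gate (`never_type`, still unstable as of this writing):

```rust
// Opting into an unstable feature. A stable rustc refuses to compile
// any crate containing a `#![feature(...)]` attribute (error E0554);
// only nightly toolchains accept it, so the v1.0 stability promise
// never has to cover code like this.
#![feature(never_type)]

fn main() {
    // Using `!` as a first-class type is the gated part here.
    let x: Result<u32, !> = Ok(42);
    if let Ok(v) = x {
        println!("{v}");
    }
}
```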
> `cargo fetch` and `cargo build --offline` can be used to separate the downloading and building while otherwise using Cargo the same way.
That option would require some work beyond a plain git clone, which may not be desirable. It does raise the general question of whether CPython would want to use crates aggressively (which can bring licence questions too) or not.
CPython currently has a pretty small set of external dependencies.
Fair point.
I haven’t used nightly for anything but the occasional nightly-only tool run (e.g. Miri) in at least five years, but then I don’t do kernelspace stuff, and using Rust for microcontroller hobby programming is still on my TODO list.
My experience has been that there isn’t much call for nightly when `cargo build`-ing userspace projects anymore.
I’d imagine `cargo vendor` would probably be a better fit for that. Beyond that, cargo-deny is good for enforcing policy on dependencies (licenses, security advisories, etc.) and cargo-supply-chain helps to automate the process of inspecting who you’re trusting, independent of how many pieces they decided to split their project into.
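As a rough sketch of how those fit together (the tool invocations are their documented entry points; the surrounding workflow is hypothetical):

```sh
# One-time setup: install the auditing tools.
cargo install cargo-deny cargo-supply-chain

# Vendor dependency sources so builds don't need Crates.io.
cargo vendor

# Enforce policy (licenses, security advisories, banned crates, etc.),
# driven by a deny.toml checked into the repository.
cargo deny check

# List who can publish new versions of each dependency, to see
# whose releases you're implicitly trusting.
cargo supply-chain publishers
```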
I wanted to start by thanking everyone for their feedback on the proposal so far, and say that we look forward to continued discussion.
After reviewing the discussion so far, we’ve decided to re-focus the (pre-)PEP to only propose the introduction of optional Rust extension modules to CPython. We hope that with experiences gained from introducing Rust for extension modules, Rust can eventually be used for working on the required modules and the interpreter core itself in the future. However, we will leave that to a future PEP when we know more and will not be proposing that as part of the current in-discussion PEP.
This should address issues with bootstrapping, language portability, and churn.
We’ve also been noting lots of other feedback we’ve received, but I wanted to call this one out in particular as it has been the source of a large portion of the discussion.
I would, but the moderation in question is relatively slight. There are two levels of trust: “Do I believe this isn’t malicious?” and “Do I believe that this is able to do what it promises?”. The compiler-specific extensions don’t significantly affect the first one (any sort of malicious implication has to be incredibly convoluted, like “the CPython devs are trying to force people to use GCC because they are trying to boost Richard Stallman’s fame and try to get him into the Guinness Book of Records” - or something equally ridiculous), though they do have an impact on the second (“in the event of a problem, do we have true options here?”). So, yes, it does impact trust, but not all THAT much.
Non-standard/compiler-specific features, to me, recall the days of IE-specific features in web sites, which had the much-less-convoluted justification “Microsoft wants everyone to use IE so they have to buy Windows”. But the trust impact depends on how viable the threat is.
I think the timeline you are laying out here is a bit too certain. I would instead propose a timeline based on the adoption of Rust within Python.
- In Python 3.15, `./configure` will start emitting warnings if Rust is not available in the environment. Optional extension modules may start using Rust. (Same as the PEP proposal.)
- Once Rust has enough usage within Python extensions, a PEP will be created with a timeline for making Rust mandatory.
This is no longer the current plan; please see Pre-PEP: Rust for CPython - #117 by emmatyping
I don’t think we should emit a warning now that these items will be entirely optional and we don’t have a plan for making Rust required. When that is proposed in a PEP, the timeline for emitting warnings and requiring Rust will be decided there.