PEP 703: Making the Global Interpreter Lock Optional (3.12 updates)

A few things:

  • Due to CPython having a GIL today, there are no existing programs written for the high performance multithreaded CPython paradigm. So it’s not like playing it safe with specialization will be slower for existing multithreaded code. Your multithreaded code should also be faster if you’re not sharing too many variables between threads, regardless of whether specialization is working. Even if a variable is ostensibly read-only, writes to the shared refcount will likely still make it less efficient to use shared data than local data. So there are many reasons to encourage limiting the use of shared data, which has a bonus of making specialization easier.
  • This check would allow specializing all single-threaded code for the nogil interpreter, as all variables will be thread-local. I think the main concern is that dropping specialization would otherwise be a single-threaded regression.
  • This check should still allow specializing some threaded nogil code at low cost, which is better than the previous proposal of just disabling specialization for nogil.
  • This check may make specialization much simpler for nogil at first, with good options to improve it later.
  • Even unspecialized multithreaded code has a good chance of outperforming CPython with GIL + specialization.
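The "prefer thread-local data" pattern in the first bullet can be sketched as follows (an illustrative example, not code from the PEP or the nogil branch): each worker accumulates into its own local variable and only touches shared state once, so the hot loop never writes to a shared object's refcount.

```python
import threading

def sum_range(start, stop, results, index):
    # Accumulate into a local variable: no shared refcount or
    # cache-line traffic while the hot loop runs.
    local_total = 0
    for i in range(start, stop):
        local_total += i
    # Touch shared state exactly once, at the end.
    results[index] = local_total

results = [0] * 4
threads = [
    threading.Thread(target=sum_range, args=(i * 1000, (i + 1) * 1000, results, i))
    for i in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

total = sum(results)
print(total)  # sum of 0..3999 == 7998000
```

The same shape (compute locally, merge at the end) is what makes specialization of thread-local variables cheap in the scheme described above.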

Yep, I’ve also run into deadlocks recently where the GIL was one of the locks held. I strongly endorse this whole message.


Not necessarily, but neither are they expected to declare they are multithread-safe at all. A pure Python library may mutate global state without any locking, in which case the GIL will not make it magically correct in the presence of concurrent calls.

I understand there’s a difference for a pure Python application that also spawns threads internally. But the GIL doesn’t protect much in that case: you still need to use locks even for trivial things such as incrementing an integer.
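To make the "even incrementing an integer needs a lock" point concrete, here is a minimal sketch: `counter += 1` is a read, an add, and a store, several bytecode instructions, so two threads can interleave between them even under the GIL; a `threading.Lock` restores correctness.

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        # counter += 1 is read/add/store: separate bytecode steps,
        # so it is not atomic even with the GIL.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 with the lock; without it, updates can be lost
```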

(ok, @eric.snow explained to me that GIL switching occurs only on certain specific instructions nowadays, but that’s more of a CPython implementation detail, and it’s very recent)


Same here, which is why I said “core devs” explicitly; a very niche, known group of people.

I know you’re quite aware of this, but there are folks that consume open source (i.e., users) and those that maintain it (i.e., maintainers). My comment was explicitly about the folks who will be asked to maintain this going forward, not the users who will (hopefully) benefit from it. And the SC is going to talk to core devs about this, so we are making sure this isn’t a blind spot in our decision-making.


Ah my bad, thanks for the clarification!


As a “consumer” of open source, I’m grateful to Sam Gross for forcing this discussion. There’s been a lot of very technical discussion, but as an outsider, I can only say that a good deal of what has happened in this thread looks more like turf wars and motivated reasoning than a community trying to solve the problems of its “consumers.”

Python has succeeded beyond almost anyone’s wildest expectations, and outside the genius of the initial design, it’s not hard to see that a great deal of its success has come from those who built up the ecosystem around it. This ecosystem now functions as a moat which other languages with the advantage of 2-3 decades of hindsight have nevertheless been unable to surmount.

Nevertheless, Python will someday fade in significance just as other languages have faded. For Python, the reason for its fade is unknowable, but I think I can say confidently that the average senior developer knows two things about Python: it is “slow”, and it cannot do parallel processing with shared memory. There have been many efforts to improve Python along the first axis, but the common knowledge is that Python is exceptionally dynamic, and that this flexibility will never allow it to be as performant as an AOT-compiled language. The second axis, however, has seen only two notable efforts, and the consensus seems to be that this is the first one that comes in at only a single-digit single-threaded performance reduction.

The apparent focus of core developers on single-threaded performance is perhaps the strangest thing about this discussion. When (also open-source) competitors like Java are forging ahead with efforts like Project Loom, acknowledging that the greatest challenge to software engineers of the present (and future) is making efficient and comprehensible use of multiple cores as Moore’s Law approaches its end, CPython’s fixation on trying to squeeze the best possible performance out of a single core truly boggles my mind.

Core Python developers did not (and could not have) predicted the rise of NumPy, Pandas, and the various other libraries that have catapulted Python to the top of the heap for any number of uses for which it would have been an unfathomable choice even 15 years ago. It seems rather odd to assume that the Python ecosystem cannot evolve to meet the challenges of “nogil”, and not just evolve, but indeed open up dramatically new horizons for system-building. And claiming that the steering council somehow needs to know the answers for what higher-level abstractions could or would be built on top of free threading is a dramatic display of narrow thinking.

Ultimately, my thesis, which is borne out by the history of Python itself, is that core developers should be far less concerned about trying to predict the future, and far more concerned about opening up possibilities that were previously impossible, and letting the incredibly powerful ecosystem push things forward as it has done for a long time. Many of us in that ecosystem have been hanging on for dear life as we wait for Python to prove that it still has a future. But languages content to rest on their laurels (like all software) eventually get eclipsed by those that are willing to take risks and trust to the broader community to explore the space opened up by foundational improvements.

Perhaps my overlong post here is just so much rhetoric. But there are many of us (here I speak mainly for the scientific and data processing community) who so far have remained on the sidelines, commenting to each other on how disappointed we are in how long it has taken to move this forward. And seeing the opposition of a few long-time core developers in public has confirmed what we already feared.

To be brutally honest, a Python that leaves these types of dramatic evolutions sitting on the workbench gathering dust is a Python that, as far as many of us are concerned, is already on its way out. And we all know that when something is on its way out, its consumers quite often find themselves looking for the next big thing. :disappointed:


You are framing things in a way that I find a bit odd, as if our concern should be a fight for the survival of Python. If Python becomes insignificant it would be because someone has created an even better language. Surely that can only mean the world has become a better place.

Several great languages have been overtaken by newer languages. It would be an anomaly of history, a sign of intellectual stagnation in our field, if this never happens to Python. I hope people are working hard on creating even better languages than Python. I hope history has not stalled in its tracks. Freedom from the backwards compatibility constraints we need to deal with, and the ability to learn from the Python experience would, surely, help do that.

I don’t know what you refer to by “turf wars”; I assume sub-turfs within the Python turf that you think we need to protect? These are not turfs. There are currently three large projects on the go, each trying to bring Python forward in a different way, and this discussion is trying to figure out what the tradeoffs are between them, where they are not compatible.

You complain that the core devs are not trying to “solve the problems of its ‘consumers’”, but you haven’t contributed to the discussion by mentioning a single concrete problem, other than some perceptual issues. Perhaps there is a perceptual issue here: it’s much easier to be frustrated at the time it takes someone else to make a difficult decision than to make the decision and assume responsibility for the outcome. From your message, I doubt you and the many you speak for even understand how complicated this decision is.

Nice rhetoric though.


This has devolved into hand wringing over theoretical issues outside the scope of the PEP, and off topic discussion, rather than focused technical discussion about the PEP itself or the current implementation.

If Sam needs more feedback at this point, he’s welcome to open a new topic or to ping the mods to reopen this topic.

Others should reflect on how to be constructive, productive, and on topic in these types of discussions. Keep in mind that there is a long history of posts to review first, that lots of thought and care has gone into this already, and refrain from hyperbole and dramatics.