PEP 703: Making the Global Interpreter Lock Optional

Yes, exactly. So what happens to existing calls like gc.collect(generation=1)? Do they start raising an exception?

One of the ideas behind PyCXX is that you should not need to know the hard-to-get-right aspects of the Python C API.

Borrowed references are one hard-to-get-right aspect that PyCXX makes go away.

In the same way, I would like to make any unsafe calls go away.
In my view, it is hard to get right whether you can or cannot safely use PyDict_GetItem/PyList_GetItem.

I would like a way to make them raise compiler errors so that I can avoid ever needing to figure out whether they are safe or not. I’m happy to #define PyGIL_UNSAFE_API_COMPILER_ERRORS to make this an opt-in feature.
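
For illustration, here is a minimal sketch of how that opt-in could work. The PyGIL_UNSAFE_API_COMPILER_ERRORS define is my own invention from above, not an existing CPython flag, but #pragma GCC poison is a real GCC/Clang feature that turns any later use of an identifier into a hard compile error:

```c
/* Hypothetical opt-in header -- not part of CPython. */
#include <Python.h>

#ifdef PyGIL_UNSAFE_API_COMPILER_ERRORS
/* Poison the borrowed-reference lookups that are hard to use safely
 * without the GIL; any use of these names below this point now fails
 * to compile (GCC and Clang support this pragma). */
#pragma GCC poison PyDict_GetItem PyDict_GetItemString PyList_GetItem
#endif
```

Code compiled with the define would then have to use APIs that return new references, such as PyObject_GetItem().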

I have been thinking about how I could have PyCXX allow a single binary extension to work both --without-gil and --with-gil.

The ABI for both builds of Python would need to support a common set of operations that works in both builds.

I think for this to work Python would need to:

  1. always allocate the extra object-header fields used by --without-gil
  2. always use the function versions of INCREF and DECREF, so that the python.so/.dll can provide the implementation matching how Python was built
  3. never use the Py_INCREF or Py_DECREF macros
  4. have any macro that uses Py_INCREF or Py_DECREF use the function version, which I would opt into via a #define
  5. the other 10 things I have not considered…

As currently defined, Py_XINCREF and Py_XDECREF inline the macro contents, so they are not suitable as-is.
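
To make item 2 concrete, here is a rough sketch, assuming a hypothetical opt-in define (Py_REFCNT_AS_FUNCTIONS is invented here for illustration). Py_IncRef() and Py_DecRef() are the function equivalents that CPython already exports, and unlike the macros they also accept NULL:

```c
/* Sketch only -- not part of CPython. */
#include <Python.h>

#ifdef Py_REFCNT_AS_FUNCTIONS  /* hypothetical opt-in */
/* Route all refcount changes through functions exported by
 * python.so/.dll, so the implementation always matches how the
 * interpreter was built. */
#undef Py_INCREF
#undef Py_DECREF
#undef Py_XINCREF
#undef Py_XDECREF
#define Py_INCREF(op)  Py_IncRef((PyObject *)(op))
#define Py_DECREF(op)  Py_DecRef((PyObject *)(op))
#define Py_XINCREF(op) Py_IncRef((PyObject *)(op))
#define Py_XDECREF(op) Py_DecRef((PyObject *)(op))
#endif
```

Because other C API macros expand at their point of use, they would pick up these function-based definitions too (item 4), though static inline helpers in the headers would not; that is the kind of detail item 5 alludes to.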

1 Like

Small question: should the design, technically, perform roughly as well on Windows as on other operating systems?

I’m curious: what do people think is an acceptable slowdown for single-threaded code?

My estimate for the performance impact of this PEP is in the 15-20% range, but it could be more, depending on the impact on PEP 659.

Personally, I don’t like the shared memory model of concurrency, so any slowdown is too much. But that’s just my opinion.

5 Likes

In my personal test, the slowdown was 40%, but the Windows antivirus may be to blame.

1 Like

We can’t actually test the performance, since the nogil branch is a fork of 3.9 with extensive modifications to counteract the slowdown from the extra locking and reference-counting overhead.

Consequently, comparisons to 3.9 are meaningless, and because the nogil branch lacks the improvements in 3.11, comparisons to 3.11 are also meaningless.

3 Likes

I was comparing Python 3.9 to nogil Python 3.9, on Windows. Unfortunately, no Windows benchmarks are run on CI systems, nor in faster-cpython/benchmarking-public on GitHub (a public mirror of the benchmarking runner repository).

To assess how it behaves on a typical ‘user’ Windows machine, measuring it (even roughly) would be nice.

Very cool proposal! As an average Python dev who has sometimes had to work with concurrency and parallelism, I ask this (maybe obvious) question: if this gets implemented, why would we devs ever use multiprocessing over multithreading?

2 Likes

If you have the problem of a long-running program growing in memory size, then it is useful to kill off a worker process and spin up a new one.
I work on a service written in Python that uses this technique.
The service runs for months at a time without a restart (only restarting when we roll out updated code).

You cannot do that resource management with threads.
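
For concreteness, a minimal sketch of the recycling pattern (POSIX C, details my own; in Python, multiprocessing.Pool’s maxtasksperchild option implements the same idea):

```c
/* Recycle a worker process after a fixed number of tasks so any
 * memory growth is returned to the OS when the worker exits. */
#include <sys/wait.h>
#include <unistd.h>

#define TASKS_PER_WORKER 1000  /* recycle after this many tasks */

static void handle_one_task(void) { /* placeholder for real work */ }

int main(void) {
    for (;;) {
        pid_t pid = fork();
        if (pid < 0)
            return 1;                    /* fork failed */
        if (pid == 0) {                  /* child: the worker */
            for (int i = 0; i < TASKS_PER_WORKER; i++)
                handle_one_task();
            _exit(0);                    /* worker memory freed by the OS */
        }
        waitpid(pid, NULL, 0);           /* parent: reap, loop respawns */
    }
}
```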

2 Likes

This is such a big change and breaks backward compatibility; to me it is very similar to what the Python community did when moving to Python 3.

Looking back, 2 to 3 was such a slow (and maybe painful) move due to the incompatibility, but I believe the community has learned so much, and I believe this time we should be able to handle the transition much more smoothly.
I would think it is better to kick off Python 4 for this PEP. We can have Python 3 and Python 4 in parallel, just like we did for Python 2 and 3.

Developers have learned so much from the last transition, and I think we will be in much better shape if we do this again. It is a much cleaner way of doing such things, although it may appear there is more work to be done.

2 Likes

I think it’s possible to introduce this without needing to support two ABIs.
Maybe that is a necessary goal for this to be accepted?

A lot of people seem to compare this change to the transition from Python 2 to 3. It might be good to remind readers that the GIL removal has very little chance of breaking a Python-only codebase, unlike the transition from 2 to 3.

The impact of the breaking changes seems an order of magnitude smaller, if I read this document correctly.

13 Likes

The concern is that this is a breaking change for extension maintainers.

1 Like

Distribution

This PEP poses new challenges for distributing Python. At least for some time, there will be two versions of Python requiring separately compiled C-API extensions. It may take some time for C-API extension authors to build --without-gil compatible packages and upload them to PyPI. Additionally, some authors may be hesitant to support the --without-gil mode until it has wide adoption, but adoption will likely depend on the availability of Python’s rich set of extensions.

To mitigate this, the author will work with Anaconda to distribute a --without-gil version of Python together with compatible packages from conda channels. This centralizes the challenges of building extensions, and the author believes this will enable more people to use Python without the GIL sooner than they would otherwise be able to.

What is the recommendation for distributors other than Anaconda? Are we encouraged to build and offer a separate nogil build of Python (similarly to how we currently build the debug build)?

4 Likes

After reading through the PEP, I stand by my initial reaction that this is exciting and impressive. However, I agree with those saying that the compiler option is problematic. My blunt summary of this PEP is that it proposes to introduce a second “officially blessed” Python version, like PyPy or MicroPython. While the PEP makes the desire for multithreading clear, does it need to be part of CPython?

What if it instead was introduced as another distribution, potentially called PyNG (for NoGil or “Next Generation”)? I think, likely naively, that this would make it easier for Python distributors to separate the projects, and it would help package maintainers advertise the scope of their packages.

Furthermore, PEP 690 (Lazy Imports) was introduced for similar reasons (improving performance for a subset of Python applications), with a similar opt-in mechanism (the run-time flag -P), and it was rejected due to concerns that it would put a significant burden on package maintainers. I think this PEP would do well to address PEP 690 and its rejection in the Related Work section.

5 Likes

Also, what will be the process for the builds distributed by python.org? Specifically, will people using the standard installers on Windows (or “casual user” distributions like the Windows Store distribution) have access to nogil builds outside of Anaconda?

To put this another way, will nogil be targeted as a “specialist” option, which you should only be using if you have a specific need for it (e.g., scientific/ML use, which is what Anaconda targets)?

4 Likes

I understand that the compiler option exists to maintain a single CPython branch, but given doubts about indirect effects, being explicit would be nice:

Python-3.12…3.99 for gil and Python-4.12…4.99 for nogil are released in parallel

My_wheel-1.2.3-py3py4.whl

… universal wheels or not, it’s made explicit: no guesswork, no need for a dedicated site, … and as long as nogil is not considered mature, you release binaries with a permanent rc tag.

It would also handle the situation where nogil for Linux is ready and having an impact, but nogil for Windows or WASM is not.

1 Like

It would also break Python-only codebases that (directly or indirectly) depend on GIL-only libraries.

2 Likes

I would forgo 10-20% of 3.11 (and perhaps even 3.10) speeds for single-threaded code.

I became a programmer at the tail end of the Python 2 era, so with the following two caveats, that (a) my memory is fading, and (b) I was not responsible for huge amounts of legacy Python 2 code, these things seem worth recalling about the 2-vs-3 split.

  • For quite a number of years, it wasn’t possible to run the same codebase on v2 and v3; six and later Python 3 versions eventually changed that, but it still required modifying the Python parts of the codebase.
  • For quite a few years, many programmers didn’t see a clear advantage to moving to Python 3, especially those who didn’t have Unicode problems. Rather, one had annoyances, such as print becoming a function and code running slower than on 2.7.
  • The ergonomics of Python 3.5+ really made the upgrade unquestionably worthwhile (at which point Python 3 was ~7 years old).

The incentive dynamics are different now, as I see it, at least.

  • Running multiple interpreters for some part of the workload is not as difficult if both enable-gil and disable-gil interpreters can run the same Python codebase without six-type modifications.
  • The pressure to utilize the entire CPU is intense.
    • Any single-threaded slowdown is accompanied by easier multi-core programming, and memory savings.
    • Thus the slowdown shouldn’t appear arbitrary to Python programmers, at least not the way the slowdown from Python 2.7 to Python 3000 did.
  • The core devs understand the “existing code” problems vastly better than they did before.
    • Breaking changes won’t be made as lightly as before.
    • Many migration difficulties are likely to have much better documented solutions, and even PRs to third-party packages.

I don’t want any of the above to be taken as making light of the risk of splitting the community in 2-vs-3 fashion, or as encouraging the core devs to be frivolous.

But I do think there’s some room to consider that the upside potential of a GIL-less Python is immense, and to presume that the community will be motivated to transition more quickly than in the 2-vs-3 era.

Hats off to the core devs for their rigorous blending of innovation and caution! Thank you for Python and increasingly, thank you for Faster Python!!

6 Likes