Is the sys recursion limit now bad/obsolete?

Its documentation explains the limit’s purpose:

This limit prevents infinite recursion from causing an overflow of the C stack and crashing Python.

But as What’s New In Python 3.12 says:

The recursion limit now applies only to Python code. Builtin functions do not use the recursion limit, but are protected by a different mechanism that prevents recursion from causing a virtual machine crash.

And indeed in Python 3.12 I can do very deep recursion just fine (Attempt This Online!):

import sys
sys.setrecursionlimit(2**31 - 1)

def f(n):
    if n:
        f(n - 1)

f(10_000_000)
print('ok')

That does get killed if I try depth 20 million, but I think that’s just because of a 2 GB memory limit on that system. A slimmer version, whose frames are smaller because the function takes no argument, can even do 20 million just fine (Attempt This Online!):

import sys
sys.setrecursionlimit(2**31 - 1)

def f():
    global n
    if n:
        n -= 1
        f()

n = 20_000_000
f()
print('ok')
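
As a side note, the “different mechanism” protecting builtins seems easy to observe too: recursion that happens in C, such as repr() of a deeply nested list, still raises RecursionError no matter how high I set the limit. A minimal sketch of what I see on CPython 3.12:

import sys
sys.setrecursionlimit(2**31 - 1)  # Python-level limit effectively disabled

# Build a list nested a million levels deep.
x = []
for _ in range(1_000_000):
    x = [x]

# repr() recurses in C, so the C-stack protection kicks in,
# not the (now huge) Python recursion limit.
try:
    repr(x)
except RecursionError as e:
    print('still guarded:', e)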

So if Python recursion is only limited by available memory, just like other things such as big ints and lists, and if “Builtin functions […] are protected by a different mechanism”, then why does this Python recursion limit still exist?

Your example works fine in Python 3.11 as well. The change in Python 3.12, IIUC, regards which functions contribute to hitting the limit, not how high you can set the limit.

Recursion was always limited by available memory; the recursion limit is intended to kill a runaway recursive function before you exhaust memory (which could impact other processes on your machine), and as soon as it is “obvious” that your recursive function is not going to terminate. (The odds that your code really needs to recurse millions of times are much lower than the odds that you wrote code that was supposed to terminate after a few hundred iterations but has a bug that prevents it from terminating at all. In the rare instance where you really intended to recurse that many times, you can increase the limit and try again.)
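
To make that concrete, here’s a small sketch (the buggy function is just an illustration): with the default limit, a recursive function that never reaches its base case fails fast with a RecursionError, long before it can chew through memory.

import sys
print(sys.getrecursionlimit())  # 1000 by default on CPython

def buggy(n):
    # Bug: the argument is never decremented, so this never terminates.
    return buggy(n)

try:
    buggy(10)
except RecursionError:
    print('caught after about a thousand frames, not gigabytes later')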


But by a different kind of memory, no? The C stack, instead of the heap. Much smaller, and a CPython implementation detail. Unlike the heap, which is much larger, and which people know. For example, I know my PC has 32 GB. I bought them and put them in there. And it’s obvious that if I build a billion objects, I’ll likely run out of memory, and that’s due to my amount of data and my amount of memory. On the other hand, I don’t know how large the “C stack” is that CPython uses, and it’s an artificial restriction, determined not by how much memory I have but by a choice CPython made.

My point is: If the limit doesn’t actually do what its documentation says, i.e., it doesn’t prevent “overflow of the C stack and crashing Python” because that’s now done “by a different mechanism”, then what is the Python recursion limit still good for? Like, if I happen to have deep recursion, why not let me? Why artificially restrict me for no apparent reason? We don’t have an equivalent configurable list size limit, either, that “protects” me from building lists too large for my memory, by default waaay before I’d actually run out of memory.

It helps find mistakes sooner. If the deep recursion was not a mistake, the recursion limit can be changed.
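
And the change can be scoped to just the call that needs it. A sketch, assuming you want the safety net back afterwards (recursion_limit here is my own helper, not a stdlib one):

import sys
from contextlib import contextmanager

@contextmanager
def recursion_limit(n):
    # Temporarily raise the limit, restoring the old value afterwards.
    old = sys.getrecursionlimit()
    sys.setrecursionlimit(n)
    try:
        yield
    finally:
        sys.setrecursionlimit(old)

def f(n):
    if n:
        f(n - 1)

with recursion_limit(100_000):
    f(50_000)  # would exceed the default limit of 1000
print('ok')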

And why aren’t lists equally limited? I could likewise argue that that “helps find mistakes sooner”.

But having to do that is inconvenient.

I don’t think that is true. At least not “likewise”.
There are legitimate uses for very large lists.
Among them: rewriting deep recursions so they don’t use recursion.
The need to hold lots of data in memory is justified by it being faster to access than other locations.
On the other hand, recursion is not always needed and can always be removed.
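
For example, a recursion over arbitrarily nested lists can be rewritten with an explicit list as the stack, so the depth is limited only by heap memory. A minimal sketch:

def total(node):
    # Recursive sum of nested lists of ints: raises RecursionError
    # on deeply nested input at the default limit.
    if isinstance(node, int):
        return node
    return sum(total(child) for child in node)

def total_iterative(node):
    # Same computation with an explicit list as the stack.
    acc, stack = 0, [node]
    while stack:
        item = stack.pop()
        if isinstance(item, int):
            acc += item
        else:
            stack.extend(item)
    return acc

nested = 0
for _ in range(10_000):
    nested = [nested, 1]
print(total_iterative(nested))  # 10000; total(nested) would raise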


And likewise there are legitimate uses for very deep recursion.

And recursion can be the natural way to solve a task, with much simpler (or even faster!) code.
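
Towers of Hanoi is the classic example (my own sketch): the recursive solution is essentially a transcription of the argument for why it works, while an iterative version needs explicit bookkeeping.

def hanoi(n, src, dst, via):
    # Move n disks from src to dst: move n-1 out of the way,
    # move the largest disk, then move the n-1 back on top.
    if n:
        hanoi(n - 1, src, via, dst)
        print(f'move disk {n}: {src} -> {dst}')
        hanoi(n - 1, via, dst, src)

hanoi(3, 'A', 'C', 'B')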

For which you can set the recursion limit.

And why should I have to do that?

Would you like it if you had to do that when you want to use long lists?

In my case, I don’t always view handicaps as a hindrance, but as a way to create something better (code, in this case).
I probably wouldn’t mind. I wouldn’t mind a short limit on the number of arguments, a limit on the length of lines, on the length of functions, on the depth of nested ifs, …


See, for example, the withdrawn PEP 611.