PEP 601: Forbid return/break/continue breaking out of finally

The behaviour of return/break/continue within a finally is clear to me, and I believe it is clear to most Python programmers (at least it is clearly documented). It is not an issue that it complicates the compiler a bit. It is not an issue that MicroPython can deviate from CPython.

But there is another issue.

try:
    return foo()
finally:
    bar()

is equivalent to

try:
    res = foo()
except:
    bar()
    raise
else:
    bar()
    return res

An except without a raise is considered bad practice in general. There are exceptions: when the exception is saved or transferred to be handled in another place or at another time, or at least logged. This is why a bare except is not banned outright. In any case, some important exceptions should be checked for and re-raised.

But if the finally block contains return / break / continue,

try:
    return foo()
finally:
    return bar()

is equivalent to

try:
    res = foo()
except:
    return bar()
    raise
else:
    return bar()
    return res

The raise in the except block is never executed. This automatically makes the example bad practice.
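
That silencing is easy to demonstrate with a minimal, hypothetical function (`swallow` is just an illustration, not code from the thread):

```python
def swallow():
    try:
        raise ValueError("this exception is lost")
    finally:
        # The return in the finally block discards the in-flight ValueError.
        return "result"

print(swallow())  # prints "result"; no traceback is ever shown
```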

The question is whether we should support a construction that always violates good practices and cannot be used in a good program. It silences KeyboardInterrupt, GeneratorExit, MemoryError, RecursionError, and CancelledError, which can be raised by virtually any code and should not be silently ignored. The only way to fix it is to get rid of finally and use except.
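
One possible shape of that fix, sketched with hypothetical `foo()`/`bar()` stand-ins: the cleanup still runs on both paths, but the exception is re-raised instead of swallowed:

```python
def foo():
    return 1  # stand-in for the real work

def bar():
    return 2  # stand-in for the cleanup / final computation

def fixed():
    try:
        foo()
    except BaseException:
        bar()   # run the cleanup...
        raise   # ...then let KeyboardInterrupt and friends propagate
    else:
        return bar()
```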

Hmmm, it seems that Discuss mangles replies by email, removing
attribution lines and “>” quotes. That makes the email interface
rather broken.

Agreed, if we were to remove every bit of surprising code from the language, I can imagine it would be stripped bare of features and new features would hardly ever be added. I don’t think there’s any issue with something being initially surprising, as long as the documentation clearly states the behavior and it’s understandable to the majority of readers.

By getting rid of finally do you mean removing it entirely from the language, just that specific usage, or am I misunderstanding something? Removing finally entirely seems like it would be quite an involved deprecation process.

I can’t tell whether you are posing a rhetorical question.

Besides tests, there are only two cases of breaking out of a finally block in the stdlib.

In Lib/subprocess.py (added in https://bugs.python.org/issue25942):

    def __exit__(self, exc_type, value, traceback):
        if self.stdout:
            self.stdout.close()
        if self.stderr:
            self.stderr.close()
        try:  # Flushing a BufferedWriter may raise an error
            if self.stdin:
                self.stdin.close()
        finally:
            if exc_type == KeyboardInterrupt:
                # https://bugs.python.org/issue25942
                # In the case of a KeyboardInterrupt we assume the SIGINT
                # was also already sent to our child processes.  We can't
                # block indefinitely as that is not user friendly.
                # If we have not already waited a brief amount of time in
                # an interrupted .wait() or .communicate() call, do so here
                # for consistency.
                if self._sigint_wait_secs > 0:
                    try:
                        self._wait(timeout=self._sigint_wait_secs)
                    except TimeoutExpired:
                        pass
                self._sigint_wait_secs = 0  # Note that this has been done.
                return  # resume the KeyboardInterrupt

            # Wait for the process to terminate, to avoid zombies.
            self.wait()

In Lib/multiprocessing/connection.py (added in https://bugs.python.org/issue12328):

        def _recv_bytes(self, maxsize=None):
            if self._got_empty_message:
                self._got_empty_message = False
                return io.BytesIO()
            else:
                bsize = 128 if maxsize is None else min(maxsize, 128)
                try:
                    ov, err = _winapi.ReadFile(self._handle, bsize,
                                                overlapped=True)
                    try:
                        if err == _winapi.ERROR_IO_PENDING:
                            waitres = _winapi.WaitForMultipleObjects(
                                [ov.event], False, INFINITE)
                            assert waitres == WAIT_OBJECT_0
                    except:
                        ov.cancel()
                        raise
                    finally:
                        nread, err = ov.GetOverlappedResult(True)
                        if err == 0:
                            f = io.BytesIO()
                            f.write(ov.getbuffer())
                            return f
                        elif err == _winapi.ERROR_MORE_DATA:
                            return self._get_more_data(ov, maxsize)
                except OSError as e:
                    if e.winerror == _winapi.ERROR_BROKEN_PIPE:
                        raise EOFError
                    else:
                        raise

The code is not simple. It silences arbitrary exceptions; this is not obvious, and perhaps was not intended. But it is not easy to say what the correct behavior should be in such cases.

1 Like

I’m mostly responsible for this PEP, and honestly I didn’t think it would lead to such a backlash.

Anyway, by way of background: about a year or so ago I found that MicroPython crashed badly with return/break/continue within a finally. It took me a while to find a fix to the VM that was minimal and didn’t impact code size or RAM usage. Around that time I discussed the problem with Nick Coghlan and we came to the conclusion that using return/break/continue within a finally, silencing all exceptions, is probably not something you’d want to do anyway.

Since the CPython stdlib used this feature rarely, and the bug in MicroPython was never reported by any user, it was low priority, and eventually got fixed. Then fast forward a year and CPython had a bug handling continue within a finally (so does MicroPython, currently not fixed) and I thought that it might be worth writing a PEP to forbid it.

I didn’t want to mention MicroPython explicitly in the PEP; I wanted the PEP to stand on its own as an “enhancement” of Python regardless of any implementation. But it was the difficulty of implementing return/break/continue that led me to write the PEP.

In my own opinion, the two strongest reasons I see for adopting this PEP are:

  1. the existing code in the stdlib that uses return within finally is really hard to understand (well, for me anyway, so maybe not such a big point), and therefore hard to maintain;
  2. almost all other languages that support “finally” in some form have come to the general consensus that return should not be used in a finally block (see the PEP for links).

On the other hand, IMO a good reason to reject this PEP would be that it’s better to retain orthogonality of language constructs than to add a special exception to forbid something (whether in the compiler or in the grammar).

I don’t want MicroPython to influence any decision on this PEP. As I mentioned above I thought this PEP was worth writing for the Python language itself. Whatever the outcome here MicroPython will try to follow suit, and if it can’t then it’ll be listed as a difference to CPython.

4 Likes

Reading the references in the PEP it seems to me that most languages implement this kind of construct but have style guides and/or linters that reject it. I would support a proposal to add this to PEP 8 (if it isn’t already there). I also won’t have hard feelings if MicroPython decides not to implement it.

I note that the toy examples are somewhat misleading – the functionality that may be useful is a conditional return (or break etc.) inside a finally block.

For example, the return in the __exit__ method in subprocess.py is actually correct, since a return from __exit__ re-raises the exception that it was originally handling, and the only exceptions it could be cancelling would be from if self.stdin: self.stdin.close() or another KeyboardInterrupt. The comment at the top of the try block hints that close() indeed can raise: “Flushing a BufferedWriter may raise an error”. Since we’re already inside error handling (__exit__), it is reasonable to assume that the author fully intended the behavior of cancelling such errors.
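
A minimal sketch (a toy context manager, not the stdlib code) of why a bare return from __exit__ is safe here: `return` yields None, which is falsy, so the with statement re-raises the exception it was handling:

```python
class Closer:
    def __enter__(self):
        return self

    def __exit__(self, exc_type, value, traceback):
        try:
            pass  # cleanup that might itself raise
        finally:
            # Returning None (falsy) from __exit__ does NOT suppress the
            # exception raised in the with body; it propagates instead.
            return

try:
    with Closer():
        raise KeyError("from the with body")
except KeyError:
    print("original exception propagated")
```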

4 Likes

@dpgeorge, @isidentical

The Steering Council took a vote, and the outcome was that 4/4 members present agreed to reject the PEP, for the reasons I outlined in my previous message. Instead, we welcome a PR for PEP 8 that recommends against this usage.

If you want to you can submit a PR to update your PEP with a rejection notice, or I can do it (probably just quoting what I said in my last message).

You’re also more than welcome to submit a PR for PEP 8.

Thanks for writing a PEP!

2 Likes

Thanks @guido for the fast decision on the PEP (even if it didn’t go my way).

I’m happy to make a PR to update PEP 601.

I’m also happy to submit a PR to update PEP 8 to recommend against the usage. But in that case (and if a linter also warns about this use), wouldn’t the uses in the stdlib (e.g. in subprocess.py) then be considered bad practice and need to be changed?

3 Likes

No. As PEP 8 itself says, one of the “good reasons to ignore a particular guideline” is:

Because the code in question predates the introduction of the guideline and there is no other reason to be modifying that code.

5 Likes

Looking forward to your PRs. Petr already answered your concern about PEP 8 vs. the stdlib.

1 Like

By “no other reason”, could a good reason be considered improving the overall readability of the section or does it have to involve a more functional modification? This may not apply to adhering to a singular PEP 8 guideline (such as this upcoming one), but I think there’s a significant amount of value that can be gained from improving the readability of a regularly visited portion of the standard library.

Improving the readability can result in users having a better understanding of what is being done within the code, and reduce the long term maintenance cost of enhancing or debugging the section in the future. Of course, this typically is not the largest priority, but gradually improving readability over time can certainly make an impactful difference.

Regarding the PEP 8 “reasons to ignore a guideline”, it also mentions that:

To be consistent with surrounding code that also breaks it (maybe for historic reasons) – although this is also an opportunity to clean up someone else’s mess (in true XP style).

Rather frequently, though, it seems that cleaning up is dismissed in favor of adhering to the preexisting standard. What would be an example of an optimal time to “clean up someone else’s mess”, and how is that differentiated from maintaining a historical standard? To me at least, there’s not always a clear distinction between the two situations.

Note: The above questions are more in the context of CPython PRs, rather than in general.

Please no. That would open the floodgates of well-meaning but potentially breaking PRs. We don’t have enough reviewers to double-check that such improvements are actually correct.

Please leave well enough alone.

2 Likes

I can definitely understand that concern, thank you for the clarification. My primary motivation for becoming a member of the triage team was to help improve the issue regarding the lack of reviewers for PRs and to help the core developers get through them more efficiently. It is most certainly not my intention to worsen that problem.

Apologies for the potentially destructive suggestion, but I’m glad that I brought it up. It’s very useful to be crystal clear on the subject for the purpose of PR reviews, so that I can provide more definitive answers to PR authors. From my impressions thus far in the last several months of reviewing PRs, this seems to be a rather commonly misunderstood area and admittedly one that I’ve not understood myself.

Thanks for understanding. We sometimes encourage people who have expressed genuine interest in contributing to submit a “trivial” PR as an exercise in learning how to go through the PR process (which as you know is quite a bit more involved for Python than it is for the typical GitHub project).

But I’ve also seen quite a few PRs that fix a trivial typo or apply a trivial formatting fix (often with PEP 8 as an excuse[1]) without further contributions by the same author, and I’m concerned that some people use their commit to plump up their resume with a bullet saying “Python Contributor” or something lofty like that, and this needs to be nipped in the bud, as such PRs do consume reviewer resources.

[1] Note that one of the section headings in PEP 8 is “A Foolish Consistency is the Hobgoblin of Little Minds”.

3 Likes

I can certainly see that being an issue. Many of those PRs are at least partially responsible for the current bottleneck.

Unfortunately, I think it’s inevitable that some people will take advantage of the system so they can become an effortless “Python Contributor”. I don’t think there’s an easy way to prevent that entirely without hindering those who genuinely want to learn, we can only somewhat limit the negative impact on reviewer time.

My first PR was a trivial documentation change, so I can certainly empathize with those looking for an introductory PR to gain experience with the workflow. When I first started, I was entirely unaware of the “easy” keyword on bpo (“newcomer friendly” didn’t exist yet), so I thought that making a minor phrasing improvement would provide a good introduction.

I would like to thank everyone who has made the process of contributing for the first time significantly easier or went out of their way to accommodate new contributors. If it weren’t for those efforts, there’s a good chance that I wouldn’t have started contributing myself.

I’ve always enjoyed that quote used in PEP 8, but I don’t think I fully appreciated its real-world application until now: conforming to standards exclusively for the sake of consistency ultimately serves no productive benefit, and can even become a detriment when obsessed over.

A later part of that quote from Emerson, “He may as well concern himself with his shadow on the wall”, could easily be translated to Python as “They may as well concern themselves with changing single quotes to double quotes”.

1 Like

I have made PRs to update PEP 601 and PEP 8, see https://github.com/python/peps/pull/1173 and https://github.com/python/peps/pull/1174 respectively.

2 Likes

Thanks, I approved and merged both.

(OT: There’s never a need to squash commits, we do that when we press the “Squash and Merge” button on GitHub. Squashing is actually counterproductive for the review, as is force-pushing amended commits – it can confuse the reviewer.)

3 Likes

I’ve said something along the same lines a few times in PRs when the authors have asked if they should squash their commits, but it’s useful to know the official stance. Thanks for mentioning it.

FWIW, I’ve now fixed continue within a finally in MicroPython, see https://github.com/micropython/micropython/pull/5154

2 Likes