Option to ignore default implementation of `@singledispatch` type-wise (Pyright)?

Hello, an example explains it best:

from functools import singledispatch

@singledispatch
def d(arg):
    raise NotImplementedError()

@d.register
def d_int(arg: int) -> int:
    return arg + 1

@d.register
def d_str(arg: str) -> str:
    return "Hello " + arg

res = d("foo") # type:Unknown
res2 = d(42) # type:Unknown
res3 = d({"foo": "bar"}) # type:Unknown; fails only at runtime

Status quo:

  • res, res2 are untyped.
  • res3 fails only at run-time.

Desired:

  • res should have type str, res2 should have type int.
  • d({"foo": "bar"}) should fail at compile-time.

Notes
For the benefit of stricter typing and explicit configuration, I'd like to omit the default implementation of d type-wise. Putting # pyright: strict at the top of the file unfortunately has no effect. Using @overload would be an alternative that errors at compile time, but @singledispatch has the charm that you don't need to manually dispatch on the argument type within the function body (best of both worlds, imo).

For completeness' sake, here is the @overload alternative:

from typing import overload

@overload
def o(arg: str) -> str: ...
@overload
def o(arg: int) -> int: ...
def o(arg: str | int) -> str | int:
    if isinstance(arg, str):
        return "Hello " + arg
    else:
        return arg + 1


res21 = o("foo") # str
res22 = o(42) # int
res23 = o({"foo": "bar"}) # Error: No overloads for "o" match the provided arguments  (good!)

Proposal

  • An explicit diagnostic setting/option in Pyright - no change to the decorator's core logic
  • A decorator argument, @singledispatch(use_default=False) or similar - a change to both the decorator and Pyright
  • … (?)

Any ideas are appreciated, thanks.


The most important thing here is that the Python typing system does not ever cause anything to fail “at compile time”. If your third-party tooling flags something and refuses to pass the code on to Python, that’s a different matter. But “Python should report a TypeError at compile time” is a non-starter: only SyntaxErrors are reported at compile time. And “my third-party tooling should flag this” is an issue with that tooling, not with Python.

No, it wouldn’t. Again, the “compile time” you have in mind isn’t any such thing. Nothing prevents you from inputting completely type-incorrect code at the REPL with typing.overload annotations, and getting the exact same result you would without those annotations (and not getting any up-front error):

>>> from typing import overload
>>> @overload
... def d(arg: None) -> None: ...
... 
>>> @overload
... def d(arg: str) -> str: ...
... 
>>> def d(arg): return 3 # , sire.
... 
>>> d(None)
3
>>> d('spam')
3
>>> d(5) # this is RIGHT OUT.
3
>>> d({'parrot': 'dead'})
3

… Or, for that matter, putting such code in a Python source file, and just… not using Pyright or whatever.

functools.singledispatch is different in that it actually decorates the function, which entails creating a wrapper that actually checks the type - at runtime, because that’s the only “time” that a generated code wrapper could possibly do anything.
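To make that concrete, here is a minimal sketch (names are mine) showing that singledispatch picks an implementation at call time, from type(obj), so an unregistered type only fails when the call actually happens:

```python
from functools import singledispatch

# Hypothetical example: dispatch is decided at call time, based on type(obj).
@singledispatch
def size(obj):
    raise NotImplementedError(f"no implementation for {type(obj)!r}")

@size.register
def _(obj: list) -> int:
    return len(obj)

print(size([1, 2, 3]))  # dispatches to the list implementation
try:
    size(42)  # no int implementation registered: fails only at runtime
except NotImplementedError as exc:
    print("runtime error:", exc)
```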

Hi @kknechtel , thanks for your comment.

Yes, I am aware of that fact; I should have formulated it better. Replace "compile-time" with "tooling-time" - this is why Pyright is mentioned specifically. For clarification: I recognize this forum as one of the sources for discussions around Pyright and its interplay with the Python type system, and the project owners seem quite active here as well. If it turns out to be the wrong place for the topic, I'll mark the post as answered.

Let me try once again: what could a possible solution look like that facilitates or enables external tooling (static type checkers, linters, LSPs) to do exhaustiveness checks for @singledispatch-decorated functions - emitting a type error whenever the default implementation branch would be taken (judging by static types)?
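For what it's worth, one workaround that already works today (a sketch, not part of the proposal; the helper name _d is mine) is to combine both decorators: @overload stubs give Pyright the precise signatures and the "no overload matches" error, while an inner @singledispatch function keeps the runtime dispatch free of manual isinstance chains:

```python
from functools import singledispatch
from typing import overload

# Runtime dispatch lives in a private singledispatch function.
@singledispatch
def _d(arg):
    raise NotImplementedError(f"no implementation for {type(arg)!r}")

@_d.register
def _(arg: int) -> int:
    return arg + 1

@_d.register
def _(arg: str) -> str:
    return "Hello " + arg

# Public, statically typed facade: Pyright only sees these signatures.
@overload
def d(arg: int) -> int: ...
@overload
def d(arg: str) -> str: ...
def d(arg):
    return _d(arg)

res = d("foo")        # Pyright: str
res2 = d(42)          # Pyright: int
# d({"foo": "bar"})   # Pyright: no overload matches (and it fails at runtime too)
```

The downside is the duplication between the overload stubs and the registered implementations, which is exactly the boilerplate I'd like to avoid.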

I made two suggestions above - another would be to add Never/NoReturn to the default implementation's signature, similar to assert_never, so that type checkers have a better chance of recognizing this pattern.

From how I read your answer, you seem to be a strong proponent of making this an exclusive concern of each external tool, rather than providing additional context from Python core to facilitate their work. (For the latter case, I described a decorator argument use_default=False just as an example.) Let me also note that needing a default implementation that just raises NotImplementedError() is quite ugly, which might speak in favor of adjustments within Python itself.

General idea is to get simple, basic support for type hints of algebraic data types/pattern matching with functions and methods.