I’m planning to write a new chapter for the typing spec that focuses on subtyping rules for callable types. This is an area that has always been underspecified in the spec and the typing PEPs.
One topic that I’d like to get feedback on before writing the chapter has to do with the subtyping rules for `...`. The ellipsis token can be used in the `Callable` special form. It can also be used to specialize a generic class or type alias parameterized with a `ParamSpec`. And it can be used with or without `Concatenate`. This is all well documented and specified.
The meaning of `...` is not precisely specified in the typing spec, but all type checkers appear to treat it as a gradual type form. That is, it’s analogous to the `Any` type, but it applies to callable signatures (or partial signatures, if used with concatenation). As a gradual type, it implies bidirectional type consistency with any callable signature (or partial signature).

For example, the types of all of the following functions are bidirectionally compatible with the type `Callable[Concatenate[int, ...], None]`:
```python
from collections.abc import Callable
from typing import Concatenate

def func1(x: int, /) -> None: ...
def func2(x: int, /, y: int) -> None: ...
def func3(x: int, /, *args: int) -> None: ...
def func4[**P](x: int, /, *args: P.args, **kwargs: P.kwargs) -> None: ...

f: Callable[Concatenate[int, ...], None]
f = func1  # OK
f = func2  # OK
f = func3  # OK
f = func4  # OK
```
So far, so good. All of the major type checkers agree on the above.
The question I have is whether `...` can be specified in any other manner. If you’re defining a callback protocol, is there a way to define the `__call__` method in a `def` statement such that the semantics are the same as if you had used `...` in a `Callable` annotation?
Mypy appears to apply an undocumented rule (heuristic?) here. If the `__call__` signature includes both a `*args: Any` and a `**kwargs: Any` parameter with no intervening parameters, then mypy treats the signature (or partial signature) as though it were `...`. If the `*args` or `**kwargs` parameter has any other type annotation (such as `object`), or if there are intervening parameters, then mypy doesn’t apply this rule. It also applies the rule in cases where the `*args` and `**kwargs` parameters are annotated with a TypeVar that is specialized to `Any`.
```python
from typing import Any, Protocol, TypeVar

T = TypeVar("T", contravariant=True)

class Proto1(Protocol):
    # Mypy evaluates (x: int, ...)
    # Pyright evaluates (x: int, *args: Any, **kwargs: Any)
    def __call__(self, x: int, *args: Any, **kwargs: Any) -> None: ...

class Proto2(Protocol[T]):
    def __call__(self, x: int, *args: T, **kwargs: T) -> None: ...

class Proto3(Protocol):
    def __call__(self, x: int, *args: object, **kwargs: object) -> None: ...

class Proto4(Protocol):
    def __call__(self, x: int, *args: Any, y: int, **kwargs: Any) -> None: ...

class Concrete:
    def __call__(self, x: int, *, y: int) -> None:
        pass

f1: Proto1 = Concrete()      # OK (mypy), Error (pyright)
f2: Proto2[Any] = Concrete() # OK (mypy), Error (pyright)
f3: Proto3 = Concrete()      # Error (mypy and pyright)
f4: Proto4 = Concrete()      # Error (mypy and pyright)
```
Currently, pyright does not apply any special-case rules here because there is no provision for them anywhere in the typing spec or in any PEP, at least to my knowledge. If you want to use `...` in a protocol in pyright, you need to use a `ParamSpec` and explicitly specialize it:
```python
class Proto5[**P](Protocol):
    def __call__(self, x: int, *args: P.args, **kwargs: P.kwargs) -> None: ...

f: Proto5[...] = Concrete()  # OK (mypy and pyright)
```
Is mypy’s special-cased behavior here defensible and desirable? What about pyright’s behavior? I can make an argument for either, but I think it’s important that we choose one and document it. I find mypy’s special-casing here to be surprising and unintuitive, but I can understand why it might have been added at some point.
If we adopt some variant of mypy’s behavior, then I question whether TypeVars specialized with `Any` should be in-bounds for the rule. I also question whether intervening keyword-only parameters should affect the rule. I’m guessing these were unintended behaviors in mypy’s implementation.