Potential Issue: There appears to be a conflict between what the Python typing library docs state and what the mypy docs state regarding function overload usage.
The Python typing library docs on function overloads state: “The non-@overload-decorated definition, meanwhile, will be used at runtime but should be ignored by a type checker.” In the provided example, the non-overloaded function has no type annotations.
The mypy docs have a similar example; however, in that example the non-overloaded function has type annotations.
Is this an issue?
Background:
This came up as I was trying to figure out how to use type annotations for some code. My definition of success is being able to run mypy (v1.14.0) in strict mode with no complaints. For the code and discussion, please see the following StackOverflow post.
Brief example:
from typing import Literal, NotRequired, TypedDict, overload

class Movie(TypedDict):
    Title: str
    Runtime: str
    Awards: str

class MovieData(TypedDict):
    Awards: NotRequired[str]
    Runtime: NotRequired[str]

@overload
def get_movie_field(movies: list[Movie], field: Literal['Awards']) -> dict[str, MovieData]:
    ...

@overload
def get_movie_field(movies: list[Movie], field: Literal['Runtime']) -> dict[str, MovieData]:
    ...

def get_movie_field(movies, field):
    return {movie['Title']: {field: movie[field]} for movie in movies}
# Check with mypy v1.14.0:
PS> mypy --strict program.py
program.py:18: error: Function is missing a type annotation [no-untyped-def]
Found 1 error in 1 file (checked 1 source file)
Someone brought up exactly the same thing fairly recently, although I couldn’t find the thread again.
I agree that the docs’ wording is confusing, but there are two important things to note:
Function overloads and how they’re supposed to work haven’t been fully spec’d yet. But it’s one of the things being actively worked on right now.
This sentence only really applies to the type of the function. The annotations on the implementation are still important for checking the body of the function, however.
In the mypy example, the args of both overloads are all ints or absent kwargs with defaults, so the union of all the types across the overload definitions is still int, or int | None (the default in that case), respectively.
But in general, given enough overload definitions, that union could be Any.
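To make that concrete, here’s a minimal sketch in the spirit of the mypy docs example (the function name and event classes are mine, not the docs’ exact code): every overload parameter is an int, and the ones that can be absent get a None default in the implementation.

from typing import overload

class ClickEvent: ...
class DragEvent: ...

@overload
def mouse_event(x1: int, y1: int) -> ClickEvent:
    ...
@overload
def mouse_event(x1: int, y1: int, x2: int, y2: int) -> DragEvent:
    ...

# The implementation's annotations are just the union of what the overloads
# accept: every argument is an int, and the optional ones default to None.
def mouse_event(x1: int, y1: int, x2: int | None = None, y2: int | None = None) -> ClickEvent | DragEvent:
    if x2 is None or y2 is None:
        return ClickEvent()
    return DragEvent()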
The typing docs part is a work in progress, yes. You can go by what the mypy docs say for now.
Essentially the overloaded signatures are what’s relevant for callers (i.e. the users of your function) and the implementation signature is what’s relevant for checking the body of the function.
mypy will additionally check whether the implementation’s signature covers all the types possible through the overloads, for additional safety. However, this will sometimes leave Any as the only valid option for a parameter annotation in the implementation, since you can’t always express the complex type relation defined through the overloads using a type annotation.
In most cases a simple Union will do the trick, however.
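As a minimal sketch of that split (a toy example of mine, not code from the thread): callers are matched against the overloads, while the body is checked only against the union-annotated implementation signature.

from typing import overload, reveal_type  # typing.reveal_type needs Python 3.11+

@overload
def double(x: int) -> int:
    ...
@overload
def double(x: str) -> str:
    ...

# A simple union is enough here; mypy checks the body against this signature.
def double(x: int | str) -> int | str:
    return x * 2

# Callers are resolved against the overloads, not the implementation:
reveal_type(double(2))      # mypy: int, not int | str
reveal_type(double("ab"))   # mypy: str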
The typing docs could be improved here. The statement “The non-@overload-decorated definition, meanwhile, will be used at runtime but should be ignored by a type checker” is correct as far as calls to the overloaded function are concerned: those don’t look at the implementation. But type checkers will still type-check the implementation function, so it’s not quite right to say that it is ignored.
From what I can tell, both pyright and mypy do three essentially separate things. When determining the type of the function, they use the @overload-decorated definitions and ignore the implementation. When checking the body of the implementation, they use its own signature and ignore the @overload-decorated ones. And there’s a compatibility check making sure that the argument types of each @overload definition are assignable to the corresponding implementation arguments, and the same for the return type.
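A small sketch of where that third check bites (my own example; the exact diagnostics differ between the two tools and across versions): the implementation below doesn’t accept everything the second overload advertises, so the compatibility check fails.

from typing import overload

@overload
def pick(key: int) -> int:
    ...
@overload
def pick(key: str) -> bytes:
    ...

# Callers are typed via the overloads and the body via this signature, but the
# compatibility check complains: the second overload's `str` parameter is not
# assignable to `int`, so the implementation can't accept all the arguments
# that overload allows.
def pick(key: int) -> int | bytes:
    return key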
The typing spec is trying to specify the first of these, but it uses language that implies that is all that is done. It seems best if the change doesn’t just make clear that other things can also happen, but spells out that it’s these two other checks and how they should be done.
On that note, should that compatibility check actually just assert that the return type of each @overload is assignable to the implementation’s return type? Both pyright and mypy currently do that and thus allow something like:
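(Sketching the kind of thing I mean, with arbitrary names: each overload return type is assignable to the implementation’s return type, so both tools accept it.)

from typing import overload

@overload
def parse(x: int) -> int:
    ...
@overload
def parse(x: str) -> str:
    ...

# int and str are each assignable to int | str | None, so the per-overload
# return-type check passes, even though no overload ever tells callers that
# None can come back.
def parse(x: int | str) -> int | str | None:
    if isinstance(x, int) and x < 0:
        return None
    return x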
But it’s clear from just the function headers that the cases where the return value isn’t an int or a str aren’t covered by any overload. I’d expect that the return type of the implementation would have to be consistent with the union of the @overload return types (i.e. exactly the union modulo gradual types). Are there cases where we want the implementation to return values that aren’t actually included in any overload? If the only reason is ergonomics, then Any would be a better choice imo.