Change type signature for an inherited method implementation

A situation I run into a lot is that I have a method implemented in a superclass and inherited by many subclasses, where each subclass should have a more specific return type for the method. I'm trying to establish the best way of annotating this.

A simple example would be:

class Base:
    def generic_method(self) -> "Base":
        # single implementation, inherited by all subclasses
        ...

class Child(Base): pass

class GrandChild(Child): pass

b = Base()
c = Child()
gc = GrandChild()

assert isinstance(b.generic_method(), Base)
assert isinstance(c.generic_method(), Child)
assert isinstance(gc.generic_method(), Child)

Now I want to add annotations somehow to say that Child.generic_method() and GrandChild.generic_method() both return Child rather than Base. I don’t know where to add these annotations because the method is only implemented in Base and is inherited by the other classes. There are other constraints involved that ensure that the appropriate types are returned even though the implementation of generic_method is the same for all classes.

I think that if I were using stub files I could just add annotations for Child.generic_method in the stub file, without adding any implementation in the actual class, but there doesn't seem to be a way to achieve the same effect with inline annotations.

This example looks almost like a case for Self, but that doesn't work, because it is not true that GrandChild.generic_method() always returns a GrandChild.

There is a way that works here, which is to add a redundant override in Child:

class Child(Base):
    def generic_method(self) -> "Child":
        """Duplicated docstring."""
        # redundant super call:
        return super().generic_method()

For all the same reasons that you would generally want to use Self where possible, this is not great: there are several Child classes and many inherited methods, and I don't want to duplicate all the docstrings and have redundant method calls at runtime. I also don't want to spread dummy implementations around for a method that should really be defined in one place only.

One thing I have tried, which is also not great but at least keeps everything defined in one place, is overloading on self in the base class:

from __future__ import annotations

from typing import overload

class Base:

    @overload
    def generic_method(self: Child) -> Child: ...
    @overload
    def generic_method(self: Base) -> Base: ...

    def generic_method(self) -> Base:
        assert False

class Child(Base): pass

class GrandChild(Child): pass

b = Base()
c = Child()
gc = GrandChild()

reveal_type(b.generic_method()) # Base
reveal_type(c.generic_method()) # Child
reveal_type(gc.generic_method()) # Child

This is accepted by pyright, although mypy reports an error:

The erased type of self “Child” is not a supertype of its class “Base”

It seems that mypy doesn't like overloading on the type of self, although it does still infer the types correctly, as reported by reveal_type, which is the main thing I want.

Are there any other/better ways of doing this?

PEP 696 has a way:

(playgrounds: Pyright)

from typing import Any

class Base[T: Base[Any] = Base[Any]]:
    def generic_method(self) -> T: ...

class Child(Base['Child']): pass

class GrandChild(Child): pass

b = Base()
c = Child()
gc = GrandChild()

reveal_type(b.generic_method())  # Base[Any]
reveal_type(c.generic_method())  # Child
reveal_type(gc.generic_method())  # Child

Mypy doesn’t support that yet, though, so you’ll have to make do:

(playgrounds: Pyright, Mypy)

from typing import Any, Generic
from typing_extensions import TypeVar

T = TypeVar('T', bound='Base[Any]')

class Base(Generic[T]):
    def generic_method(self) -> T: ...

class Child(Base['Child']): pass

class GrandChild(Child): pass

b = Base[Base[Any]]()
c = Child()
gc = GrandChild()

reveal_type(b.generic_method())  # Base[Any]
reveal_type(c.generic_method())  # Child
reveal_type(gc.generic_method())  # Child
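One consequence of this parameterization, sketched below with the same TypeVar setup (the Special class is my own addition): any branch of the hierarchy can pick its own return type simply by re-inheriting from Base with a different argument.

```python
from typing import Any, Generic, TypeVar

T = TypeVar('T', bound='Base[Any]')

class Base(Generic[T]):
    def generic_method(self) -> T: ...

class Child(Base['Child']): pass
class GrandChild(Child): pass

# A branch whose method should return itself just re-parameterizes Base:
class Special(Base['Special']): pass

# reveal_type(Special().generic_method()) would show Special
```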

You can do that in regular files too! You just need to guard the declaration with an if TYPE_CHECKING: block so that it doesn't override the implementation at runtime. (TYPE_CHECKING is a constant imported from typing that type checkers assume to be true but that is always False at runtime, so your type checker will see the declaration, but it will never actually be executed.) E.g.:

from typing import TYPE_CHECKING

class Base:
    def generic_method(self) -> "Base":
        # single implementation, inherited by all subclasses
        ...

class Child(Base):
    if TYPE_CHECKING:
        def generic_method(self) -> "Child": ...

class GrandChild(Child):
    pass

Thanks. I have considered this sort of thing, and there are many ways in which it would make sense to use type parameters here. This is potentially what the proper solution will be in the end, but I'm just not sure yet. I didn't consider using Any here, so maybe that can make it work better, because otherwise it seemed like we needed recursive type variables like:

T = TypeVar('T', bound='Base[T]')

In context we would need at least two interdependent type parameters like:

from typing import TypeVar, Generic

T1 = TypeVar('T1', bound="Base[T1, T2]")
T2 = TypeVar('T2', bound="Base[T1, T2]")

class Base(Generic[T1, T2]):
    ...

class Child(Base[T1, 'Child']): pass

class GrandChild(Child[Child]): pass

I’m very reluctant right now to introduce type parameters at the base of this class hierarchy because there are a thousand subclasses plus more in downstream codebases as well. I suspect that for compatibility it would be a lot easier to introduce type parameters now than it would be to remove/change them later. I think we would need to be very sure that we had the right type design before committing to any type parameters and we are currently a long way from being able to be sure about that.

Perfect, I didn’t think of this. This does exactly what I wanted and is what I will use for now.

I’m not particularly concerned about this myself but out of interest how does this work from a “checking” perspective?

It seems that this technique can bypass the checker:

from __future__ import annotations

from typing import TYPE_CHECKING

class Base:
    def generic_method(self) -> Base:
        return Base()

class Child(Base):
    if TYPE_CHECKING:
        def generic_method(self) -> Child: ...

    def child_method(self):
        ...

class GrandChild(Child):
    pass

# type checks fine but fails at runtime:
Child().generic_method().child_method()

That is, unless someone just sets TYPE_CHECKING to True at runtime, which some people apparently do.

Yeah, you certainly can blow stuff up by doing this, and the type checker won't help you with it. The philosophy, basically, is that Python is such a highly dynamic language, and one that was never originally meant to have type checking, that there is far too much for a type checker to cover completely. A fully typed program doesn't give you an absolute guarantee that it won't fail at runtime; it just helps you write code that avoids common mistakes.

In particular, the annotation on the Child class is basically us telling the type checker to trust us that generic_method will only ever return a Child when called on it. The checker takes us at face value, without verifying that claim, and only checks that the downstream consequences are consistent. This kind of behaviour can be great in cases like this, where we do want to make additional assertions that the type checker doesn't (or can't) reason out itself, but it also carries the risk of us making a mistake.


That is exactly what I want in this case. In many similar situations I find I need to use type: ignore, so I was just a bit surprised by this one. If I call super() explicitly, then a type: ignore is needed:

class Base:
    def generic_method(self) -> "Base":
        assert False

class Child(Base):
    def generic_method(self) -> "Child":
        return super().generic_method()  # type: ignore

In context I just need the checker to understand what is actually returned here because these methods are called in thousands of other places so nothing else can be inferred or checked properly unless the return type here is understood.
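If you'd rather make the assertion explicit at the return site than silence the checker wholesale, typing.cast does the same job and is a no-op at runtime. A minimal sketch of that variant (my example, not from the thread):

```python
from __future__ import annotations

from typing import cast

class Base:
    def generic_method(self) -> Base:
        return Base()

class Child(Base):
    def generic_method(self) -> Child:
        # cast() simply returns its second argument at runtime; it only
        # tells the checker to treat the value as a Child from here on.
        return cast(Child, super().generic_method())
```

Unlike an ignore comment, the cast survives reformatting and names the exact type being asserted, though it is just as unverified.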

I suppose that, in principle, a type checker could check the code of the inherited method as if it were inserted here. There could maybe be a decorator for this, like:

from typing import inherits  # hypothetical: no such name exists today

class Child(Base):
    @inherits(Base)
    def generic_method(self) -> Child:
        ...

Then, at runtime, the decorator could remove the method, while a checker could understand that this inherits the implementation but with a different type signature, and check it as if the inherited function body were inserted here.
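The runtime half of that idea can be approximated today. This is only a sketch under the stated assumptions: the inherits name is hypothetical, no type checker gives it the proposed meaning, and instead of removing the stub it reuses the base class's function object, which has the same effect as plain inheritance:

```python
def inherits(base):
    """Hypothetical decorator: discard the stub body and reuse the base
    class's implementation, so runtime behaviour matches plain inheritance."""
    def decorator(func):
        # Look up the real implementation on the base class and return it,
        # so the subclass attribute is the very same function object.
        return base.__dict__[func.__name__]
    return decorator

class Base:
    def generic_method(self):
        return "from Base"

class Child(Base):
    @inherits(Base)
    def generic_method(self):  # stub body, replaced by Base's at runtime
        ...
```

Returning the base function rather than deleting the name keeps the class dict simple; lookup would also fall through to Base if the name were absent entirely.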