Deferred annotations and generic base classes

I thought Python 3.14 and its deferred annotations would let us avoid forward refs when inheriting from a generic type, but it looks like it’s not the case, and I can’t find any relevant information in PEP 649:

# generic_base.py 
class Foo[T]:
    ...

# Works fine for a type annotation.
foobar: Foo[Bar] = Foo()

# But not when inheriting from the class.
class FooBar(Foo[Bar]):
    ...

class Bar:
    ...
% python3.14 generic_base.py
Traceback (most recent call last):
  File "/home/pawamoy/generic_base.py", line 8, in <module>
    class FooBar(Foo[Bar]):
                     ^^^
NameError: name 'Bar' is not defined
% python3.14 -V             
Python 3.14.0a2

Looking at the code, the reason seems obvious: in the base-class case, Foo[Bar] is evaluated at runtime. But the [Bar] part really is typing information, so should it affect runtime at all?

Let’s take a look at dataclass-like transforms, for example. With the new deferred annotations, these libraries will be expected to try inspect.get_annotations with the various formats: Format.STRING, Format.FORWARDREF and Format.VALUE. The last one will fail with a NameError:

from inspect import get_annotations, Format

class Foo[T]:
    ...

class A:
    foobar: Foo[Bar] = Foo()

print(get_annotations(A, format=Format.STRING))
print(get_annotations(A, format=Format.FORWARDREF))
print(get_annotations(A, format=Format.VALUE))

class Bar:
    ...
% python3.14 generic_base.py
{'foobar': 'Foo[Bar]'}
{'foobar': __main__.Foo[ForwardRef('Bar')]}
Traceback (most recent call last):
  File "/home/pawamoy/generic_base.py", line 12, in <module>
    print(get_annotations(A, format=Format.VALUE))
          ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/pawamoy/.basher-packages/pyenv/pyenv/versions/3.14.0a2/lib/python3.14/annotationlib.py", line 704, in get_annotations
    ann = _get_dunder_annotations(obj)
  File "/home/pawamoy/.basher-packages/pyenv/pyenv/versions/3.14.0a2/lib/python3.14/annotationlib.py", line 837, in _get_dunder_annotations
    ann = _BASE_GET_ANNOTATIONS(obj)
  File "/home/pawamoy/generic_base.py", line 8, in __annotate__
    foobar: Foo[Bar] = Foo()
                ^^^
NameError: name 'Bar' is not defined

It seems unfair to me that class FooBar(Foo[Bar]) triggers the same error as if inspect.get_annotations(..., format=Format.VALUE) had been used. It feels… too early? If the [Bar] part in inheriting from Foo[Bar] is only meant for type checkers and dataclass-like transforms, why not give dataclass-like libraries a chance to handle it the way they handle attributes? Worst case, Bar must indeed be declared before the transform is applied; that’s a user error, and the user can change their code. Best case, Bar is not needed at all to create the FooBar class; it’s only needed later, when type-checking things.

Is it simply the price to pay for using the subscript syntax? The following should definitely be evaluated eagerly and not deferred, and I suppose the interpreter cannot tell the difference?

class Bar:
    ...


classes = {"bar": Bar}
var = "bar"

class Foo(classes[var]):
    ...

That’s what it comes down to, yes. PEP 649 doesn’t change anything about runtime subscripts; in most cases the compiler can’t know whether a subscript is intended for typing or for runtime. What it changes is how __annotations__ is generated, which only affects when annotations are evaluated. So it should be similar to your experience with from __future__ import annotations in earlier Python versions, except you also get the more powerful inspect.get_annotations and typing.get_type_hints functions.
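To illustrate the “when” part, here’s a minimal sketch building on your second snippet (using annotationlib, where Format and get_annotations live in 3.14). Once Bar exists, the deferred annotation evaluates cleanly, even in VALUE format:

from annotationlib import Format, get_annotations

class Foo[T]:
    ...

class A:
    # Deferred by PEP 649, so no NameError at class creation time.
    foobar: Foo[Bar] = Foo()

class Bar:
    ...

# Bar is defined by now, so every format works, including VALUE:
print(get_annotations(A, format=Format.VALUE))  # roughly {'foobar': __main__.Foo[__main__.Bar]}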


Thanks @Daverball!

@bswck suggested an alternative on Discord:

class Foo[T]:
    ...


class FooBar[T=Bar](Foo[T]):
    ...


class Bar:
    ...

That doesn’t mean quite the same thing: FooBar will still be generic, it just now has Bar as the default for T.
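Assuming the snippet above has run (Python 3.13+, so the T = Bar default parses), you can see that at runtime:

# FooBar is still parametrized; the default only kicks in when no argument is given.
print(FooBar.__type_params__)                  # (T,)
print(FooBar.__type_params__[0].__default__)   # <class '__main__.Bar'>, evaluated lazily
print(FooBar[int])                             # it can still be specialised with something else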

But yes, PEP 695 syntax doesn’t suffer from the same drawback, because the brackets in that syntax aren’t treated as a subscript in an arbitrary expression; they are part of the class definition statement itself. You can also see this in the AST produced by this code: the type parameters are available directly on the ClassDef node. The Foo[T] part in the parent class list is still just a regular subscript, though.
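A small sketch of that AST difference (again assuming Python 3.13+ so the default parses):

import ast

node = ast.parse("class FooBar[T = Bar](Foo[T]): ...").body[0]

# The type parameters are stored on the ClassDef node itself...
print([type(p).__name__ for p in node.type_params])  # ['TypeVar']
# ...while the parent class Foo[T] is an ordinary Subscript expression:
print(type(node.bases[0]).__name__)                   # 'Subscript'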


The type statement can also be used to create a forward reference to a type:

type BarT = Bar
class Foo[T]:
    pass
class FooBar(Foo[BarT]):
    pass
class Bar:
    pass

which IMO is fine
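As I understand it, this works at runtime because the right-hand side of a type statement is evaluated lazily; Foo[BarT] only stores the alias object. A quick check, assuming the snippet above has been run:

# The alias only resolves its value on first access, by which point Bar exists:
print(BarT.__value__)             # <class '__main__.Bar'>
print(FooBar.__orig_bases__[0])   # roughly __main__.Foo[BarT]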


Sure, although at that point you may as well save yourself the extra statement and name and just use a string forward reference:

class Foo[T]:
    ...

class FooBar(Foo["Bar"]):
    ...

class Bar:
    ...

You do get easier runtime introspection if you use a type statement, so there are some advantages, but I personally wouldn’t bother, unless I knew I was going to take advantage of it in some way.
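For completeness, here’s roughly what runtime introspection of the string version looks like (a sketch against the FooBar above): the base class arguments only carry an unevaluated ForwardRef, whereas a type alias hands you the target directly through __value__.

from typing import get_args

# The string ends up as an unevaluated ForwardRef in the base class arguments,
# which you would still have to resolve yourself:
print(get_args(FooBar.__orig_bases__[0]))   # roughly (ForwardRef('Bar'),)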

Bar is not defined yet when it is used in foobar: Foo[Bar] = Foo() and in class FooBar(Foo[Bar]). This is nothing more than global-scope name resolution.

Define Bar before using it:

# generic_base.py 
class Foo[T]:
    ...


class Bar:
    ...


# Works fine for a type annotation.
foobar: Foo[Bar] = Foo()


# But not when inheriting from the class.
class FooBar(Foo[Bar]):
    ...

For context, I got an issue in mkdocstrings where a user showed that the FileWatcher.Event class in Receiver[...] (see snippet below) wasn’t rendered as a link, but as a string. Indeed, they were using a forward reference, and detecting that something is a forward reference through static analysis is not trivial. So I was looking for an example to add to the docs where a forward ref was not necessary.

class Receiver[T]:
    ...

class FileWatcher(Receiver["FileWatcher.Event"]):
    class Event:
        ...

Of course that’s the obvious thing to do; however, you can see in the case above that it was not possible, or at least not as straightforward :slightly_smiling_face:


For nested classes I do actually prefer using type aliases, so in that example I would go with @alwaysmpe’s suggestion and use a type statement:

class Receiver[T]:
    ...

type FileWatcherEvent = FileWatcher.Event

class FileWatcher(Receiver[FileWatcherEvent]):
    class Event:
        ...

In most of the other cases I would try to restructure the code so the forward reference isn’t necessary, or live with the missing cross-reference in the docs. But I’m also not sure whether mkdocstrings would cross-reference the type alias, or FileWatcher.Event in this case, so it might not actually be a viable solution for that specific use case, unless you decide to elide simple type aliases and cross-reference the symbol they’re aliasing.
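For what it’s worth, a possible heuristic for that eliding idea (just a sketch, not anything mkdocstrings does today) would be to follow __value__ until you land on the real target:

from typing import TypeAliasType

def resolve_alias(annotation):
    # Hypothetical helper: unwrap chained `type` aliases down to the aliased object.
    while isinstance(annotation, TypeAliasType):
        annotation = annotation.__value__
    return annotation

print(resolve_alias(FileWatcherEvent))   # <class '__main__.FileWatcher.Event'>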


Yes, I like the type approach too! I’ll add it to the docs.

In this case, FileWatcherEvent (in the base class) would point to the type alias, and the alias itself would display its value, FileWatcher.Event, which then points to the actual class. Well, that’s the plan; we don’t yet support aliases declared with the type statement :see_no_evil: We’ll need a new “type” kind of object, alongside modules, classes, functions and attributes. This is exciting :smile:


If you’re documenting stuff, I’ll add a caveat.

A TypeAliasType created using the type statement can’t currently be unpacked at runtime. I’ve raised the issue in the cpython repo and it should work in the future (I’ve volunteered to try to fix it), but currently the code below errors at runtime:

type BarTuple = tuple[int, str]
class Foo[*T]:
    ...
class Bar(Foo[*BarTuple]):
    ...

but the (deprecated) TypeAlias form works instead:

from typing import TypeAlias
BarTuple: TypeAlias = tuple[int, str]
class Foo[*T]:
    ...
class Bar(Foo[*BarTuple]):
    ...

Depends how complicated your type arguments are. At the top of my current code file:

type _ReComposableVT = str | ReComposable
_ReComposableT: TypeAlias = tuple[_ReComposableVT, ...]
type _ReOrVT = ReExpr | ReTokenExpr | str | ch_set | ch_xset | or_seq
type _ReOrT = tuple[_ReOrVT, _ReOrVT, *tuple[_ReOrVT, ...]]

Oh, for sure. But at that point you would do it regardless of whether you needed a forward reference or not, just to help with readability, since the name of the type alias now becomes documentation for what this complex type expression is supposed to encapsulate, and it avoids repetition.

My point was that I wouldn’t usually define one just for the sake of a single forward reference, since it adds unnecessary indirection, which even hurts readability to a small degree, and it doesn’t help with repetition.


Thanks! From the perspective of dynamic analysis for documentation purposes, inspecting its __value__ attribute will be enough and I won’t need it to support unpacking :slightly_smiling_face: But that’s good to know!
