Type imports, with runtime support

First of all, sorry to open yet another thread about this kind of feature. This has been discussed many times, under different forms:

What I’m proposing here is a continuation of Lazy imports and PEP 649, with a draft specification regarding runtime support.

A bit of motivation…

…to be extended if this ends up as a PEP.

In Pydantic, I often get issues/questions about failing evaluation of annotations for recursive models. Consider the following setup:


model_a.py

from pydantic import BaseModel

class ModelA(BaseModel):
    model_b: ModelB | None = None
    model_c: ModelC | None = None

from model_b import ModelB
from model_c import ModelC

ModelA.model_rebuild()

model_b.py

from pydantic import BaseModel

class ModelB(BaseModel):
    model_a: ModelA | None = None

from model_a import ModelA

ModelB.model_rebuild()

model_c.py:

from pydantic import BaseModel

class ModelC(BaseModel):
    definition: ModelB | None = None

from model_b import ModelB

ModelC.model_rebuild()

When trying to run `from model_b import ModelB`, you end up with hard-to-debug errors from Pydantic due to import loop issues [1]. While PEP 649 may actually help in these cases [2], there are still situations where you can only import the necessary symbols in an `if TYPE_CHECKING:` block, and so runtime type checkers can’t resolve the relevant annotations.
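To make that failure mode concrete, here is a minimal, self-contained illustration (using a standard-library class rather than the Pydantic models above): the import only exists for the type checker, so resolving the annotation at runtime fails.

```python
from typing import TYPE_CHECKING, get_type_hints

if TYPE_CHECKING:
    # Only seen by static type checkers, never executed at runtime.
    from decimal import Decimal

def price() -> "Decimal":
    return ...

# At runtime, 'Decimal' was never bound, so resolving the
# annotation raises NameError: name 'Decimal' is not defined.
try:
    get_type_hints(price)
except NameError as exc:
    print(f"failed to resolve: {exc}")
```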

Type imports

A new type import statement is introduced, following the same forms as the existing import statement. For static type checkers, type imports should be treated as being equivalent to normal imports, as long as the imported symbol(s) is/are used in valid locations (that is, where type expressions can be used).

At runtime, the symbol bound by the type import will be a DeferredRef instance, holding information about the imported module and, optionally, the imported attribute.

DeferredRef’s signature looks like:

class DeferredRef:
    def __init__(self, module_name: str, package: str | None = None, attribute: str | None = None): ...

Here is how type imports would map to a DeferredRef instance (the comments show how DeferredRef gets instantiated):

type import typing  # typing ~ DeferredRef('typing', None, None)

type import typing as tp  # tp ~ DeferredRef('typing', None, None)

from typing type import List  # List ~ DeferredRef('typing', None, 'List')

from ..parent type import obj  # obj ~ DeferredRef('..parent', __package__, 'obj')

type import collections.abc  # collections ~ DeferredRef('collections.abc', None, None)

Because the wildcard form of import – from module import * – requires binding all public names (which can’t be known without actually importing the module), it isn’t supported with type imports.

The DeferredRef class behaves in a special way: it records every operation applied to it. To obtain the object a DeferredRef actually represents, its resolve() method can be used, which will:

  1. Import the module.
  2. Replay the recorded operations on it.

Here is a Python implementation of the DeferredRef class:

from __future__ import annotations

import operator
from importlib import import_module
from typing import Any, Callable, final

type Operation = Callable[[Any], Any]

@final
class DeferredRef:
    def __init__(self, module_name: str, package: str | None = None, attribute: str | None = None) -> None:
        self.module_name = module_name
        self.package = package
        self.operations: list[Operation] = []
        if attribute is not None:
            self.operations.append(operator.attrgetter(attribute))

    def _make_new(self, *operations: Operation) -> DeferredRef:
        new = DeferredRef(self.module_name, self.package)
        new.operations = self.operations.copy()
        new.operations.extend(operations)

        return new

    def __getattr__(self, attribute: str) -> DeferredRef:
        return self._make_new(operator.attrgetter(attribute))

    def __getitem__(self, item: Any) -> DeferredRef:
        return self._make_new(operator.itemgetter(item))

    def resolve(self) -> Any:
        resolved = import_module(self.module_name, self.package)
        for op in self.operations:
            resolved = op(resolved)
        return resolved

And this is how it would behave at runtime (the comments show the final repr of the DeferredRef instances):

type import typing
print(typing)  # DeferredRef('typing', None, [])

list_str = typing.List[str]

print(list_str)  # DeferredRef('typing', None, [attrgetter('List'), itemgetter(<class 'str'>)])
print(list_str.resolve())  # typing.List[str]

# For imports using the `from` clause, we make use of the `attribute` __init__
# parameter, that is then converted into an operation with `attrgetter`:
from typing type import List
print(List)  # DeferredRef('typing', None, [attrgetter('List')])

So far we have only implemented support for the __getattr__ and __getitem__ operations (the most common ones used in type expressions), but this could be extended to other operators as well (similar to how the PEP 649/749 stringifier supports most expressions).

Runtime evaluation

We could then imagine adding a new resolve_deferred parameter to annotationlib.get_annotations(), which would recursively call resolve() on every encountered DeferredRef instance. I’m not sure yet how this would play with eval_str and format.

Conclusion

This has the benefit of:

  • avoiding the runtime cost of imports for applications (CLIs, small libraries where initial import time matters) that do not care about runtime inspection of annotations;
  • allowing runtime inspection to resolve such imports lazily.

It is possible that I missed details here that would make this proposal not viable. Feedback welcome, also if you can think of better alternatives regarding runtime support!


  1. Similar issues: pydantic/pydantic#11532, pydantic/pydantic#11250 – as you can see, debugging what’s happening is tedious. ↩︎

  2. We have basic 3.14 support landing soon, but deferred annotations aren’t fully supported yet. ↩︎


I’d much prefer a general lazy import syntax, probably like PEP 690 but with explicit syntax like “lazy import foo”. This is useful in many places outside of typing.

In previous discussions there were some arguments that a type-only import would be helpful to prevent accidentally using the values. I’m not convinced that’s enough reason to add separate syntax.
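For context, the standard library already offers a building block for module-level laziness: importlib.util.LazyLoader, which defers executing a module until an attribute is first accessed. The helper below follows the pattern from the importlib documentation; a lazy import statement would essentially give this behaviour first-class syntax.

```python
import importlib.util
import sys


def lazy_import(name):
    """Import `name` lazily: the module body runs on first attribute access."""
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)
    return module


json = lazy_import("json")   # module body not executed yet
print(json.dumps({"a": 1}))  # first attribute access triggers the real import
```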

In this code:

type import typing  # typing ~ DeferredRef('typing', None, None)


type import typing as tp # tp ~ DeferredRef('typing', None, None)

from typing type import List  # List ~ DeferredRef('typing', None, 'List')

from ..parent import obj  # obj ~ DeferredRef('..parent', 'current_pkg'  -- that is __package__, 'obj')

type import collections.abc  # collections ~ DeferredRef('collections.abc', None, None)

from some_mod type import *
# Considering some_mod exports obj1 and obj2:
# obj1 ~ DeferredRef('some_mod', None, 'obj1')
# obj2 ~ DeferredRef('some_mod', None, 'obj2')

I suppose for this example:

from ..parent import obj # ...

You meant to write:

from ..parent type import obj # ...

Thanks, updated (I also changed the semantics of star imports, which are actually not possible to support).

It seems like this option was also suggested in the discussion thread after the resolution. I’m supportive of the idea, but not knowledgeable enough to pursue it myself.