Type checking for __dict__

Is there any way to keep type checking working for methods or attributes that are added dynamically?

I mean keeping track of updates made through __dict__ dynamically.

for example:


class A:
    def method_a(self):
        pass


class B:
    def method_b(self):
        pass


class C:
    def method_c(self):
        pass


class Adaptor:
    def __init__(self, obj, dynamically_changed):
        self.obj = obj
        self.__dict__.update(dynamically_changed)


objs = [A(), B(), C()]
for o in objs:
    if hasattr(o, "method_a") or hasattr(o, "method_b"):
        if hasattr(o, "method_a"):
            dynamically_changed = {"method_c": o.method_a}
        elif hasattr(o, "method_b"):
            dynamically_changed = {"method_c": o.method_b}

        o = Adaptor(o, dynamically_changed)

    print(f"{o}, {o.method_c}")

Obviously the type checker does not know about method_c on object o and therefore reports a type error.

A workaround like the one below is ugly:

class Adaptor:
    def __init__(self, obj, dynamically_changed):
        self.obj = obj
        self.__dict__.update(dynamically_changed)

    def method_c(self):
        pass

The clue is in the name “static analysis” or “static type checking”. Many of the highly dynamic things Python does cannot be understood by type checkers, because there’s no way to statically infer the type of such a dynamic construct, and that includes __dict__. Sometimes you can model this dynamic behavior in a static way and write a mypy plugin for your specific use case, but it’s often too niche[1] to warrant a change to the type system itself.

What you would usually do in these cases is either make Adaptor a subclass of Any or give it a __getattr__ method that returns Any. Both will have the effect that type checkers will assume that this type can have an arbitrary number of extra attributes that cannot be inferred. That doesn’t help you with language servers, but at least you won’t get as many type errors.
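
For instance, a minimal sketch of the __getattr__ variant (the other option would be declaring class Adaptor(Any), which is allowed at runtime since Python 3.11):

from typing import Any


class Adaptor:
    def __init__(self, obj: Any, dynamically_changed: dict[str, Any]) -> None:
        self.obj = obj
        self.__dict__.update(dynamically_changed)

    def __getattr__(self, name: str) -> Any:
        # Only called when normal attribute lookup fails; because it is
        # declared to return Any, type checkers stop flagging unknown
        # attributes such as method_c as errors.
        return getattr(self.obj, name)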

A more common pattern, like a proxy type, has a higher chance of ending up in the type system at some point, and I have seen several proposals about it in the past.

In your example you can probably get away with the class instead accepting a Callable, and with a combination of ParamSpec, TypeVar and Concatenate you can pass those parameters forward to the dynamic method:

from collections.abc import Callable
from typing import Concatenate


class Adaptor[**P, T]:
    def __init__[S](
        self,
        obj: S,
        method: Callable[Concatenate[S, P], T],
    ) -> None:
        self.obj = obj
        self._method = method

    def method_c(self, *args: P.args, **kwargs: P.kwargs) -> T:
        return self._method(self.obj, *args, **kwargs)
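
Used with the classes from your example, that could look roughly like this (a sketch; it assumes the adapted methods take no extra arguments):

a = A()
adapted = Adaptor(a, A.method_a)
adapted.method_c()  # equivalent to A.method_a(a); parameters and return type are preserved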

  1. or too expensive for static analysis


OK, got it. It comes back to “the more specific, the fewer errors”, sadly.

A static __dict__ goes against the dynamic nature of the Python language; as a consequence, there is no way to track __dict__, for now.

I was thinking of adding some compile option like --static-attr to enable or disable such dynamic features; it might decrease memory usage, since all attributes of an object could be fixed and shared before runtime.

It’s called __slots__ and is documented here: 3. Data model — Python 3.12.2 documentation

You may also find this wiki page helpful: UsingSlots - Python Wiki
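
For illustration, a tiny sketch (the class name is made up):

class Point:
    # The set of attribute names is fixed at class creation time and
    # instances get no per-instance __dict__, which saves memory.
    __slots__ = ("x", "y")

    def __init__(self, x: float, y: float) -> None:
        self.x = x
        self.y = y


p = Point(1.0, 2.0)
# p.z = 3.0  # would raise AttributeError: 'Point' object has no attribute 'z'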

Also note that even with __dict__, the attribute keys are shared between all instances of a class in CPython via a somewhat complex dict optimization described in PEP 412.