Proxy and delegation patterns are very useful for adding to, selectively altering, or restricting access to the functionality of an existing object. In Python, this is most easily implemented with `__getattr__`, which delegates access to missing attributes to the wrapped object. The problem is that `__getattr__` is a black box to static type checkers, so when a class defines `__getattr__`, they simply treat all attribute accesses as valid:
```python
class Subject:
    def method(self):
        pass

class Proxy:
    def __init__(self, delegate):
        self.delegate = delegate

    def __getattr__(self, name):
        return getattr(self.delegate, name)

Proxy(Subject()).method()   # OK
Proxy(Subject()).method(1)  # huh? no complaint from static type checkers
Proxy(Subject()).foo()      # huh? no complaint from static type checkers
```
The code above only draws complaints from static type checkers if `__getattr__` isn’t defined.
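The closest workaround available today, as far as I know, is to lie about the base class at type-check time only. A rough sketch, assuming the delegate type is fixed:

```python
from typing import TYPE_CHECKING

# At type-check time, pretend Proxy subclasses Subject so that
# attribute accesses are checked against Subject's interface.
if TYPE_CHECKING:
    _ProxyBase = Subject
else:
    _ProxyBase = object

class Proxy(_ProxyBase):
    def __init__(self, delegate: Subject) -> None:
        self.delegate = delegate

    def __getattr__(self, name: str):
        return getattr(self.delegate, name)

Proxy(Subject()).method(1)  # now an error: too many arguments
Proxy(Subject()).foo()      # now an error: no attribute "foo"
```

This catches the errors above, but it is non-declarative boilerplate, and it misleads the checker into treating `Proxy` as a genuine subtype of `Subject`, which delegation is not.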
I think there ought to be a declarative way to indicate that any missing attribute of a `Proxy` instance should fall back to that of a `Subject` instance, so that it is clear to static type checkers when to complain about missing attributes and when to apply the delegate’s typing information to the proxy object (so that `Proxy(Subject()).method` gets a proper signature).
Possible options include:
- Repurposing the `->` token (pros: most concise):

  ```python
  class Proxy -> Subject:
      ...
  ```

- A new soft keyword (pros: most readable):

  ```python
  class Proxy delegates Subject:
      ...
  ```

- A new decorator (pros: most backwards-compatible):

  ```python
  from typing import delegates

  @delegates(Subject)
  class Proxy:
      ...
  ```

- A new magic superclass à la `Protocol`, thanks to @jesnie (pros: supports generics):

  ```python
  from typing import Proxy

  class MyProxy(Proxy[Subject]):
      ...

  class MyGenericProxy[S](Proxy[S]):
      ...
  ```
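Whichever spelling is chosen, the intended checker behavior would be roughly the following (the diagnostics here are hypothetical; the exact wording would be up to each checker):

```python
p = Proxy(Subject())

reveal_type(p.method)  # hypothetically: "def () -> None", taken from Subject
p.method()             # OK
p.method(1)            # error: too many arguments
p.foo()                # error: neither "Proxy" nor "Subject" has attribute "foo"
```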
Certain runtime functions such as `dir()` and `inspect.signature` could benefit too.
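For example, today `dir()` only reflects delegation if the proxy implements `__dir__` by hand, along the lines of the sketch below; a declarative delegation marker would give the runtime enough information to derive this automatically:

```python
class Proxy:
    def __init__(self, delegate):
        self.delegate = delegate

    def __getattr__(self, name):
        return getattr(self.delegate, name)

    def __dir__(self):
        # Merge the proxy's own attribute names with the delegate's so
        # that dir() and tab completion also list delegated attributes.
        return sorted(set(super().__dir__()) | set(dir(self.delegate)))
```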
What does everyone think?