Proposal: `@typing.decorator_transform` to annotate a decorator as having a specific behavior

The typing.no_type_check_decorator decorator was introduced in Python 3.5:

Decorator to give another decorator the no_type_check() effect.

As its usage was very limited (and it was never implemented in mypy), it was recently deprecated.

My proposal is to make a “generic” version of this decorator, to give any other decorator a specific effect.

The name decorator_transform is probably not a good one, but I can’t find a better one for now (I haven’t thought much about it). Basically, it would work this way:

@typing.decorator_transform(property)  # Or maybe specifying the argument as a string: @decorator_transform("property")
def my_decorator_behaving_as_a_property(func):
   """This decorator will transform the `func` callable to a `property` if necessary."""

class A:
    def __init__(self) -> None:
        self.a = 1

    @my_decorator_behaving_as_a_property
    def b(self) -> int:
        return self.a + 1

reveal_type(A().b)  # Revealed type is: int

This would help type checkers understand that a specific decorator is behaving as another decorator that requires a special behavior (e.g. with classmethod(), the type checker needs to understand the first argument will be the class object, and not the self instance).

This behavior happens in Pydantic, for example using the computed_field() decorator:

If the computed_field decorator is applied to a bare function (e.g. a function without the @property or @cached_property decorator) it will wrap the function in property itself. Although this is more concise, you will lose IntelliSense in your IDE, and confuse static type checkers, thus explicit use of @property is recommended.

With this new proposal, no need to add an explicit @property to the decorated function.
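The runtime side of this pattern is already trivial to write; it is only the static-typing side that is missing. A minimal sketch of what such a decorator does at runtime (the `ensure_property` name and class are hypothetical, for illustration only):

```python
from functools import cached_property


def ensure_property(func):
    """Hypothetical helper: wrap ``func`` in ``property`` unless it is
    already a property (or cached_property)."""
    if isinstance(func, (property, cached_property)):
        return func
    return property(func)


class C:
    @ensure_property
    def b(self) -> int:
        return 2


assert C().b == 2  # accessed as an attribute, not called
```

At runtime this works today; the proposal is only about letting static type checkers know that `ensure_property` behaves like `property`.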

Pydantic also defines field_validator(), which will transform the decorated method into a classmethod() if necessary.
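The runtime half of that classmethod pattern is equally small; a sketch with a hypothetical `ensure_classmethod` helper (not pydantic’s actual implementation):

```python
def ensure_classmethod(func):
    """Hypothetical helper: wrap ``func`` in ``classmethod`` unless it
    already is one."""
    if isinstance(func, classmethod):
        return func
    return classmethod(func)


class Model:
    @ensure_classmethod
    def check(cls, value: int) -> int:
        # `cls` is the class object, not an instance
        return value + 1


assert Model.check(1) == 2
assert Model().check(1) == 2  # also binds the class when accessed on instances
```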

I don’t know of any other library that would benefit from this proposal, but the pattern could be relatively common.

PS: This is a quick idea that I had yesterday, there might be things I haven’t thought about that would make this impossible. There’s also a lot of room for discussion/ideas/improvements on this one.


We have this issue at $work as well as in Pantsbuild.

IIRC we do something like:

if TYPE_CHECKING:
    mything = property
else:
    def mything(...): ...

Which really is just a hack (and we’ve seen pylint error intermittently on those lines).
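For reference, a complete, runnable version of that workaround; the runtime branch here is a hypothetical body that simply applies `property` itself:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Type checkers see a plain property, so attribute access type-checks.
    mything = property
else:
    def mything(func):
        # Hypothetical runtime behaviour: wrap the function in a property.
        return property(func)


class Example:
    @mything
    def value(self) -> int:
        return 42


assert Example().value == 42
```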

So I’ve taken a deeper look at it, and it seems this could potentially be supported natively using type hints, although it would require a lot of (breaking) changes.

First using a simple example with cached_property:

from functools import cached_property
from typing import Callable, TypeVar

T = TypeVar("T")


def my_cached_property(__func: Callable[..., T]) -> cached_property[T]:
    ...

class A:
    @my_cached_property
    def p(self) -> int:
        return 1

reveal_type(A().p)  # Revealed type is "builtins.int"
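For completeness, filling in the `...` body with the obvious runtime implementation (just delegating to `cached_property`) keeps the caching behaviour; the `calls` counter below is only there to demonstrate it:

```python
from functools import cached_property
from typing import Callable, TypeVar

T = TypeVar("T")
calls = 0


def my_cached_property(func: Callable[..., T]) -> cached_property[T]:
    # Hypothetical runtime body: simply delegate to cached_property.
    return cached_property(func)


class B:
    @my_cached_property
    def p(self) -> int:
        global calls
        calls += 1
        return 1


b = B()
assert b.p == 1
assert b.p == 1  # second access hits the cache
assert calls == 1  # the function body ran only once
```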

Now the issue with property is that it is currently not generic (see “Make `builtins.property` generic” · Issue #985 · python/typing and “"class property" should be Generic[T]” · Issue #4731 · python/typeshed).

Quoting Eric Traut:

It would have been nice if property were defined as generic earlier, but making the change now would be very disruptive, and it would be a step backward in some respects if type checkers removed all of their custom logic and error messages specifically for properties.

So maybe this decorator_transform addition could solve this issue: the logic is kept in the type checkers, and people can still “tell” the type checker this logic should be applied to their own decorator function.

I believe the same goes for classmethod (or maybe I just couldn’t make it work):

from typing import Callable, Concatenate, ParamSpec, TypeVar

_T = TypeVar("_T")
_R_co = TypeVar("_R_co", covariant=True)
_P = ParamSpec("_P")


def my_clsmethod(__func: Callable[Concatenate[type[_T], _P], _R_co]) -> classmethod[_T, _P, _R_co]:
    ...

class A:
    
    @my_clsmethod
    def f(cls) -> int:
        return 1

reveal_type(A.f())
# main.py:13: error: Argument 1 to "my_clsmethod" has incompatible type "Callable[[A], int]"; expected "Callable[[type[Never]], int]"  [arg-type]
# main.py:13: note: This is likely because "f of A" has named arguments: "cls". Consider marking them positional-only
# main.py:17: error: Argument 2 to "__get__" of "classmethod" has incompatible type "type[A]"; expected "type[Never]"  [arg-type]
# main.py:17: note: Revealed type is "builtins.int"
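For what it’s worth, the runtime half works fine; it is only the annotated signature above that mypy rejects. The same decorator without the annotations, as a sketch:

```python
def my_clsmethod(func):
    # Runtime behaviour only: wrap the function in a classmethod.
    # The static-typing half is exactly what the proposal would provide.
    return classmethod(func)


class C:
    @my_clsmethod
    def f(cls) -> int:
        return 1


assert C.f() == 1
assert C().f() == 1  # the class, not the instance, is bound as `cls`
```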

So if there’s some kind of logic defined in type checkers for properties (as confirmed in the linked GitHub issues) and classmethods (I’m not sure about this one), maybe this proposal could be beneficial.

Finally, regarding no_type_check: there’s no way to convey that information with type hints alone (which is why no_type_check_decorator was implemented in the first place).

cc the Pydantic team @adriangb @samuelcolvin, you might be interested in this

This would be helpful for pydantic. To be clear, in the case cited, the right thing to do currently is to use both decorators:

@computed_field
@property

But that’s understandably overly verbose since you always have to use both. At runtime we can essentially apply the @property decorator for you as a convenience if you haven’t already. A couple of lines of code inside of pydantic removes a lot of boilerplate for users, but as per this discussion type checkers don’t like it. If we could mark @computed_field as having @property behavior then it would work both at runtime and for static type checkers.