PEP 702: Marking deprecations using the type system

I agree that a @deprecated decorator would be useful, but I don’t think deprecation checking should be conflated with type checking. Specifically,

  • I don’t agree that just because the analysis is similar to type checking, a type checker is therefore the right place to report deprecation usage.
  • I would be neutral at best on adding this to mypy, leaning towards being against it. If I were maintaining a type checker, I wouldn’t consider this in scope, but I might feel compelled to support it because mypy does, or because users might expect everything in the typing module to be supported.
  • Tools that don’t do type checking (linters, documentation generators, etc) could make use of such a decorator as well.

Perhaps the decorator could be added to the warnings module instead. A new module (statictools?) might be appropriate if there were more things to include in such a module. To be honest, deprecated seems like a good fit for the built-in scope if conflicting with pre-existing uses of the name weren’t an issue.

3 Likes

Not currently; see the PEP’s rejected ideas section. Support for something like this could be added in a future PEP.

As I mentioned on typing-sig before, I still strongly disagree with this decorator emitting runtime warnings. I can see that many people want this functionality, but I don’t think a decorator in typing is the way to do it.

Instead, I’d love to see a deprecated decorator in the warnings module, which handles the runtime behavior. This decorator could then also get the object marked as deprecated typing-wise, but typing.deprecated itself should only affect type checkers, without any runtime consequences. I simply don’t think that typing is a good place for something with runtime behavior like this.

Also, a decorator without runtime behavior would allow users to create their own decorators that handle deprecation warnings differently. For example, some people may want to use dates instead of library versions, or include other information without having to embed it in a literal string in the decorator. It would also allow for different ways to surface the deprecation warning itself, e.g. with a print statement or a log message.

Of course, custom decorators could also be made possible by just including a no_runtime argument for the decorator, and that does mostly address them. However, it could be pretty annoying to have to specify that argument every time, and it still goes against my general point of not handling runtime behavior in the typing module.
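For illustration, a warnings-based runtime decorator of the kind suggested here could be as small as the following sketch. The name, signature, and `category=None` opt-out are hypothetical, chosen for this example; they are not the PEP's actual API:

```python
import functools
import warnings

def deprecated(message: str, category=DeprecationWarning):
    """Hypothetical warnings-based decorator: warns each time the callable is used."""
    def decorator(func):
        if category is None:  # opt out of the runtime warning entirely
            return func
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(f"{func.__qualname__} is deprecated: {message}",
                          category, stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return decorator

@deprecated("use new_api() instead")
def old_api() -> int:
    return 42
```

A custom decorator could then handle the message however it likes (dates, versions, logging) while a separate typing-only marker carries the static information.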


That said, I actually think there’s an even better solution than two decorators (one in typing, the other in warnings): have this PEP add something like typing.Deprecated[X] instead. I’m aware that deprecating constants/variables is currently a rejected idea, but this approach doesn’t just allow that. What it would actually give us is a complete system that would let anyone create custom deprecation decorators which carry along the typing information and can handle runtime behavior in any way users would like (there could still be a default implementation in warnings).

This kind of implementation could for example look like this:

from typing import Callable, TypeVar, cast

T = TypeVar("T", bound=Callable)

# Deprecated here is the proposed special form, not something that exists today.
def deprecate(fun: T) -> Deprecated[T]:
    def wrapper(*a, **kw):
        print("Using a deprecated function!")
        return fun(*a, **kw)
    return cast(Deprecated[T], wrapper)

This is generally much more versatile than the two separate implementations (warnings + typing), and I think many projects could actually benefit from it. There are already popular libraries that implement deprecation decorators that raise warnings (like Deprecated · PyPI), and we should give these libraries a way for their custom decorators to carry along the typing information about deprecation.

Not only that, this would also add support for deprecating constants easily, with syntax like MY_CONST: Deprecated[int] = 5, which I have seen some other people ask about here too. While that is currently rejected because the statistics suggest there aren’t many cases of constants being deprecated in the stdlib, the prior benefits I mentioned could alone make this worth it, with deprecation of constants being an extra bonus.

I would however still suggest doing some more research into how many projects could actually benefit from constant deprecations, outside of just the stdlib. As for me, as an author of a few typed libraries, I know I would benefit from it in various places, and I’m sure many others would too.

3 Likes

I agree there’s a good argument that the decorator belongs in warnings rather than typing, but I don’t see any real reason to have two decorators. A decorator doesn’t have to be in the typing module for it to have special treatment by type checkers. enum.Enum and dataclasses.dataclass are both heavily special-cased by type checkers, and neither is native to the typing module.

6 Likes

Just moving the decorator to warnings would be nice, but it would then require an argument to disable the runtime behavior for people who’d like to handle this in a custom way, with their own deprecation decorator. A no_warning argument solves this, but it could be pretty annoying for people with custom decorators to have to specify it each time.

I’ve actually edited my message and added a similar note about the argument solution, along with some other remarks.

We already allow turning off the runtime warning: you can set warning_cls=None.

3 Likes

Ah, yeah, then my suggestion would mainly be to move this into warnings instead of keeping it in typing, but I still think it’s worth considering my suggestion of using the more versatile generic typing.Deprecated. This would be a much nicer approach, as people could implement their own decorator and have it carry the deprecation typing information alongside. It’s much cleaner to be able to use a single decorator than to have to use two and set warning_cls=None each time.

One disadvantage that I see here is that there’s no way to specify a message with this approach. We could go the Annotated route and do something like typing.Deprecated[X, "Replaced by Y"], but I’m not sure I like this. Alternatively, we could forgo the message and leave that for the custom runtime handling, having the type checker only recognize that something is deprecated, without any custom messages.

But other than this, what’s the reason not to do it? I don’t really see any other advantages of having deprecated be a decorator; it just limits its use to callables and makes custom implementations really annoying. I’d like to at least see some pros/cons here, explaining why a decorator was chosen instead.

1 Like

A decorator has more obvious semantics. For example, how would you annotate a deprecated class? You might say that you need to deprecate the __init__ or __new__ method, but what if you’re deprecating an Enum, Protocol, or dataclass without such a method? Or what if you’re deprecating a class’s public constructor, but not the class itself?

More broadly, with a deprecated function, it’s the function itself that’s deprecated, not the type. The PEP is quite explicit that any usage of a deprecated function should be flagged (e.g., map(deprecated_func, ...)), not just calls.

As for putting the decorator into warnings, I think that would be misleading. The primary purpose of the decorator is to interface with type checkers; the fact that it also (optionally) emits a runtime warning is just a bonus feature.

I feel strongly that a decorator is the clearest way to mark deprecated classes and functions, which appears to be the most common need. I welcome a follow-up PEP for expanding the feature set for deprecations, but I’d like to keep PEP 702 more focused.

1 Like

The original motivation for the PEP was for typing purposes, and the typing features greatly strengthen the benefits that the PEP brings. But now that it emits a warning at runtime (which I think was the correct decision), I think it’s likely also to be useful to many users who have no interest in typing. For those users, I think it could be equally misleading to have it in the typing module, as that could imply that it’s a typing-only feature that’s of no use to people uninterested in Python’s typing features.

(FWIW, I like that it’s a decorator, and that this allows for easy customisation of the error message that type checkers emit. I agree that a decorator has more obvious semantics.)

2 Likes

I already stated this earlier, but Flask etc. probably won’t use this decorator since we need to deprecate other things, like arguments and attributes. We also want precise control over what triggers the warning and other behavior if deprecated stuff is used.

I wish we could find another solution to annotating deprecations that was more flexible, because I do like the general idea of type checking this. Perhaps something similar to the Annotated[type, info] construct? A real annotation has the advantage of no runtime cost at all compared to a decorator.

# The class is deprecated
class Example(typing.Deprecated): ...

# The function or method is deprecated
def example() -> Deprecated[int]: ...

# An argument is deprecated
def example(old: Deprecated[str]): ...

# An attribute is deprecated
class Example:
    old: Deprecated[str]

Perhaps it could take a second message parameter, but I’d also be fine leaving runtime warnings to the library, which can usually target them better than a general annotation/decorator could.

# Maybe an optional message?
def example() -> Deprecated[
    str,
    "The 'example' function will be removed in Library 2.0. Use 'other' instead."
]:
    ...
6 Likes

I may have misunderstood the text of the PEP, but it seems that deprecating parameters is already supported by the proposed decorator:

@overload
@deprecated("old_param is deprecated, use new_param instead")
def f(*, old_param: int, new_param: Optional[str] = None): ...

@overload
def f(*, old_param: None = None, new_param: Optional[str] = None): ...

def f(*, old_param: Optional[int] = None, new_param: Optional[str] = None):
    ...

The SC is still considering PEP 702, but we’re generally in favour of a standard way of marking deprecations – both for static analysis and for runtime warnings. However, we think having the typing decorator carry runtime warning behaviour is misleading; the runtime behaviour would be better suited to the warnings module. We also understand that it may be desirable to have static analysers emit warnings about deprecated use without having a runtime warning.

Can we have both? A typing.deprecated decorator that’s just for static analysis and has no runtime effect, and a warnings.deprecated decorator that warns at runtime and can be used by static analysis tools?

5 Likes

While I agree that the decorator is better suited to the warnings module, I think having two decorators is confusing. Please note that the current draft already allows turning off the runtime warning by setting category to None.

2 Likes

Yeah, having two decorators is confusing (TOOWTDI). But I’d think that static type checkers are perfectly capable of special-casing things not imported from typing, so having a single decorator in warnings seems fine.

1 Like

Could I get feedback on my alternative annotation idea above? PEP 702: Marking deprecations using the type system - #30 by davidism I’d really like it considered before this one is accepted, and I’d be willing to learn how to put it in PEP form if typing people don’t see any huge issues up front.

1 Like

I originally really wanted this myself too, but this comment by Jelle (PEP 702: Marking deprecations using the type system - #28 by Jelle) actually made me think about it a bit more, and eventually agree that this PEP should probably be left purely with the decorator. We could potentially extend that later, with another PEP, but marking types as deprecated raises quite a few questions that need answering.

Consider this:

def foo(x: int) -> Deprecated[int]:
    ...

From a function like this, is the function itself deprecated? I.e., should calling the function trigger a deprecation notice from the type checker, or does it just return a deprecated value? Only the returned int value is marked deprecated here, not the function itself. Instead, what you’d want here is probably Deprecated[Callable[[int], int]], which could be achieved with some decorator, or by manually casting the function to this type, i.e. foo = cast(Deprecated[Callable[[int], int]], foo). This gets annoying though.

Say you import a deprecated constant and then use it in various places in your code: where should the type checker produce warnings? Only at the import statement, or at every place we use it? For example, importing a deprecated constant with from my_lib import X could trigger a warning, but what if we just accessed it without importing, as my_lib.X?

What if we have a lot of places where it’s printed/passed around/accessed: do all of these trigger a type error? What about assignments? Will A = X also trigger a type warning, and will the deprecated type carry over to A? Producing a warning everywhere like this would probably get incredibly annoying; I sometimes need to use deprecated things, and I really don’t want to have to put type: ignore all over my codebase because of it. But when should the warning be produced then? Perhaps only the first time this type is seen? Wouldn’t that carry some edge cases with it though?

What does it mean for a class to be deprecated? Should all created instances also get marked as deprecated, and any values obtained from such a class as well? (Would, say, a MyClass.FOO class variable also be deprecated?) Or is it only the construction of the class, i.e. calling __init__/__new__? If it’s only the construction, what about alternative constructors, such as MyClass.from_foo, which also returns an instance? Should the deprecation propagate? I.e., if the from_foo constructor calls a deprecated __init__ to make the instance, does that mean using this constructor produces deprecation warnings in our codebase, even though the constructor itself is from an external library and wasn’t explicitly marked deprecated?

What does it mean for an attribute to be deprecated? Sure, this might seem obvious: setting it causes a deprecation type warning. But it’s a bit weird that this attribute’s type is actually Deprecated[int]. Would that mean we should be passing a value that actually has the type Deprecated[int] to satisfy the annotation, and perhaps that would then skip the deprecation error? Probably not, but it’s still weird that you can pass such a type.

With a decorator, it is very clearly stated that only calling triggers the type error, so any direct call to a function decorated with the deprecated decorator will produce a type warning. This also works with functions like map(deprecated_f, [1, 2, 3]), and it works well with the runtime behavior of the decorator, i.e. producing an actual warning on each call. Simply put, a decorator only allows the function itself to be deprecated, not types. The issue with the annotation approach is that it’s the types which would be deprecated, not the variables/functions/classes themselves. This could be a problem, and probably carries a bunch of other edge cases that I haven’t thought of here.

Here are various thoughts about the PEP, unrelated to the current discussion:

Maybe the interaction with @property should be discussed in the PEP. I’m presuming this is allowed:

class A:
    @property
    @deprecated("deprecated attribute")
    def x(self) -> int:
        return 4

a = A()
print(a.x)  # deprecation warning

but is it also valid to only deprecate the setter? Then type checkers would need to check assignments.
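At runtime, deprecating only the setter composes naturally with @property. A minimal stdlib-only sketch (the deprecated decorator below is a stand-in I wrote for illustration, not the PEP's implementation):

```python
import functools
import warnings

def deprecated(message: str):
    # Stand-in for the PEP's @deprecated, reduced to a runtime warning on call.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(message, DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return decorator

class A:
    _x: int = 0

    @property
    def x(self) -> int:  # reading x stays warning-free
        return self._x

    @x.setter
    @deprecated("setting 'x' is deprecated")
    def x(self, value: int) -> None:  # only assignment warns
        self._x = value
```

The static-analysis question remains, though: a type checker would have to flag assignments to the attribute while leaving reads alone.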

Another interesting case is if a method in a protocol is marked deprecated:

class P(Protocol):
    @deprecated("don't use this", category=None)
    def m(self, x: int) -> str: ...

I guess that could make sense in some cases, but it leads to somewhat weird behavior:

class C:
    def m(self, x: int) -> str:
        return f"x={x}"

c = C()
c.m(0)  # no warning
p: P = c
p.m(0)  # deprecation warning

The PEP also states that this scenario:

A class uses the @override decorator from PEP 698 to assert that its method overrides a base class method, but the base class method is deprecated.

is “relatively unlikely”, but in my experience, libraries often use a mechanism where you have to inherit from a base class and implement certain abstract methods. For example, in PyTorch Lightning, you are supposed to inherit from LightningModule and then implement the hooks you need, like .training_epoch_end(). But in the recent 2.0 release, they replaced many of the hooks with other hooks that work slightly differently, and .training_epoch_end() was replaced with .on_train_epoch_end(). With the deprecation mechanism, they could have warned users ahead of time.

How would you mark a parameter to a function as deprecated via this pep?

It seems like the Deprecated[] typing could sort of do that, though I’m not sure how a decorator would.

For example:

# How would I say that the biz param is deprecated?
def foo(bar: str, biz: str):
    pass

@overload
@deprecated("biz is deprecated")
def foo(bar: str, biz: str) -> None: ...

@overload
def foo(bar: str) -> None: ...

def foo(bar: str, biz: str) -> None:
    pass
3 Likes

Thanks for the additional input. I’m going to be fairly busy for a few more weeks and then I’ll propose a new version of the PEP. I’ll respond here to a few of the above posts.

Deprecated[…]

@davidism suggests an alternative syntax using a typing.Deprecated special form that can be used as a base class, return annotation, or attribute annotation. I continue to think that a decorator provides a more intuitive interface for deprecating a class or function (previous comment in PEP 702: Marking deprecations using the type system - #28 by Jelle, also cited by @ItsDrike). I would be open to a future PEP to add a Deprecated[] marker for deprecating attributes, constants, and parameters, but this adds enough complexity that I think it’s better saved for a separate PEP. For example, the syntax in @davidism’s post doesn’t allow a deprecation message, but I think it’s important to be able to communicate a precise deprecation message to users. We could make something like Deprecated[<type>, "message"], but type checker authors have expressed concerns about adding more annotation syntax that uses strings for a different purpose than annotations. In summary, if you want this, write a PEP.

Comments from @tmk

@tmk brings up a few cases:

  • Can only the setter of a property be deprecated? I would say yes, and I can add something to the PEP to make that explicit.
  • I don’t see what’s weird about the protocol example. The protocol method is deprecated, so if you use it, you get a warning.
  • The point about @override is convincing. I should change the PEP to say that if a method marked with @override overrides a deprecated method, type checkers should warn.

Where should the decorator live, and should it raise warnings?

Now on to @thomas’s comment on behalf of the SC. The SC thinks that it’s confusing if a decorator in the typing module has runtime behavior, which is the PEP’s current proposal. So that approach is out. Here are some alternatives:

  1. Separate decorators in typing and warnings, as @thomas suggests above. As others pointed out, it feels redundant to have two decorators serving almost the same purpose.

  2. Add only @warnings.deprecated and have type checkers special-case it, also suggested above by @srittau. This feels somehow wrong to me as the decorator isn’t inherently about warnings: warnings are just one mechanism to communicate the deprecation. It feels a little like adding a new enum to the stdlib and putting it in the enum module just because it is an enum. Many usages of the decorator (e.g., in stub files, on overloads, on protocols) will never raise a runtime warning.

  3. Add only @typing.deprecated with no runtime warnings. This was my original proposal, but I changed it because many people wanted the runtime behavior. However, I still think it is an attractive option. The reason we’re putting this in the stdlib is to provide a standard way for type checkers to track and communicate deprecations. There isn’t such a strong reason for standardizing runtime DeprecationWarnings in the stdlib: there are existing third-party packages that do that, and they work fine. The runtime behavior of @deprecated is likely to run into edge cases (e.g., @deprecated breaking __init__ inheritance when used with mixins/multiple inheritance · Issue #251 · python/typing_extensions · GitHub) that I’d rather not have to deal with.

Therefore, I’m inclined to change the PEP to go with option 3 and remove the runtime warning behavior. However, I’d be curious to hear other opinions.

3 Likes