PEP 702: Marking deprecations using the type system

Thanks. I get what you’re saying that Deprecated[] can always be added later, it shouldn’t matter if @deprecated also exists. I’ll see how this turns out for now.

4 Likes

Option 3 seems like the worst of both worlds to me: we don’t get runtime warnings, which is what many people wanted (and for which there were many good reasons in previous discussions), while we still have the problem from option 1 if we decide to add a decorator that issues runtime warnings later.

Edit: And while I still think that warnings is the best place for the decorator, because - to me - the runtime behavior is actually the “leading” behavior (even if this originally started as a typing proposal), in the end the module it ends up in is bikeshedding territory. I would be fine with the decorator ending up in typing as well.

3 Likes

I think Option 3 is good. I’ll still be writing all my own warnings because I want to use the same code for all types of deprecations, not only the ones this decorator can mark. The one thing I can’t get right now is type checker support, so that’s all I want the decorator to do.

It seems there’s an almost even split between people advocating for runtime behavior and people who are against it. Because of that, it might be wise to try to come up with a solution that satisfies both camps.

To do that, rather than proposing just a direct decorator for deprecating a function, the PEP could add a “meta” decorator: one used to decorate a decorator, which would then itself contain the actual runtime logic.

This PEP could then propose that one such decorator should be warnings.deprecated, but other custom ones can be created as well, where people can handle runtime behavior in custom ways.

By doing this, we would allow for any custom decorators to still carry the typing information about the deprecation itself.
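Roughly, such a meta decorator could look like this (all names here are made up for illustration; nothing below is from the PEP):

```python
import warnings

# Hypothetical "meta" decorator: at runtime it only tags the custom
# decorator; a cooperating type checker would treat anything wrapped
# by a tagged decorator as deprecated.
def deprecation_decorator(decorator):
    decorator.__is_deprecation_decorator__ = True  # marker for type checkers
    return decorator

# A custom deprecation decorator with its own runtime behavior, which
# still carries the typing information via the meta decorator.
@deprecation_decorator
def my_deprecated(func):
    def wrapper(*args, **kwargs):
        warnings.warn(f"{func.__name__} is deprecated",
                      DeprecationWarning, stacklevel=2)
        return func(*args, **kwargs)
    return wrapper
```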


The thing is, a standard way to deprecate something that produces warnings at runtime should, in my mind, be a bit more complex than just passing a message. I like having the message constructed automatically from the function’s name and various passed attributes, such as replacement="new_func" or removal_version="5.2.3".
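For illustration, a default-message builder along those lines might look like this (the parameter names are just examples, not a proposed API):

```python
def make_message(func, *, replacement=None, removal_version=None):
    # Build the deprecation message automatically from the function's
    # name plus optional structured attributes.
    msg = f"{func.__qualname__} is deprecated"
    if removal_version is not None:
        msg += f" and will be removed in version {removal_version}"
    if replacement is not None:
        msg += f"; use {replacement} instead"
    return msg
```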

I also need some extra behavior that isn’t as simple as altering a message. In some of my libraries, the decorator specifies the version at which the decorated function is considered deprecated; at runtime, warnings are raised only while the library is below that version. Once the deprecation version is reached, I instead raise a DeprecationWarning exception.
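Sketched out, with an assumed LIBRARY_VERSION tuple standing in for however the library tracks its own version:

```python
import warnings

LIBRARY_VERSION = (5, 1, 0)  # assumed current version of the library

def deprecated_until(removal_version, message):
    """Warn while the library is below `removal_version`;
    raise once that version has been reached."""
    def decorator(func):
        def wrapper(*args, **kwargs):
            if LIBRARY_VERSION >= removal_version:
                # The deprecation period is over: fail loudly.
                raise DeprecationWarning(message)
            warnings.warn(message, DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return decorator
```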


Allowing us to construct our own decorators, while also providing a sane default implementation, seems like a good option that should leave most people happy.

That said, there is one potentially big issue with something like this: the deprecation message would probably be impossible for the type checker to pick up. With a custom deprecation decorator that just takes a bunch of kwargs, the type checker would have no chance of understanding them.

To me, allowing arbitrary custom runtime behavior is much more important than letting the type checker pick up a custom deprecation message, especially when that message needs to be a literal.

Just knowing that something is deprecated would still be very helpful, even without a message. For more info, people can look at the changelog, peek at the definition and read the actual source code, or simply run the code and see the deprecation warning it produces.

We could have some hacky bypass for this, such as having the type checker look for a kwarg named type_checker_message passed to the custom decorator and showing that, but I don’t think that would be a great solution.

2 Likes

I agree with @srittau: I like option (2) best.

It’s true that adding runtime behaviour makes the feature more complicated to get right. But I think it will be surprising and disappointing for users to find out that it doesn’t have runtime behaviour. For users who don’t want the runtime behaviour, or who hit unfortunate edge cases with it, it’s easy to turn off by passing category=None to the decorator. They can then either use their own decorator alongside @deprecated, or add runtime deprecation warnings manually using warnings.warn().
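For clarity, a minimal sketch of what the category=None opt-out means semantically (this is not the actual implementation):

```python
import warnings

def deprecated(msg, *, category=DeprecationWarning, stacklevel=1):
    def decorator(func):
        if category is None:
            # Opt out of runtime behavior: only attach the marker that
            # type checkers and introspection tools can read.
            func.__deprecated__ = msg
            return func
        def wrapper(*args, **kwargs):
            warnings.warn(msg, category, stacklevel=stacklevel + 1)
            return func(*args, **kwargs)
        wrapper.__deprecated__ = msg
        return wrapper
    return decorator
```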

Or adding a new context manager to the stdlib and putting it in contextlib just because it’s a context manager. Which is basically what we did with contextlib.chdir :wink:

I again agree with @srittau that where the decorator lives probably isn’t the most important question, but warnings seems to me to be as good a place as any, since that’s where all the other stdlib machinery for handling deprecations lives.

2 Likes

Is @overload described in the PEP? I mean, it’s mentioned, but I don’t think it’s defined. Is it similar to overloading · PyPI? I think maybe it should be defined more fully in the PEP, unless I’m missing something.

Edit: Always more for me to learn. Found: typing — Support for type hints — Python 3.11.4 documentation

Surprising that Google didn’t find that when searching for “@overload” python.

typing.overload is an already existing feature; it is not proposed as something new here. See the docs for more info on it. overload was originally introduced, along with a number of other things, in PEP 484.

I just realized that although I had shared private feedback, it would most probably be useful to share it publicly here and show my support.

I don’t have a strong opinion about the debated points, I just wanted to say that I really like this.

I’m already benefiting from it. Pydantic has already started using it, and VS Code already supports it, so I got instant feedback when using something deprecated (a strikethrough in the editor UI) and was able to detect and correct it right there. Orders of magnitude faster than with previous mechanisms, and with a great developer experience.

I already started using it in FastAPI and will continue to do so, as well as for Typer, SQLModel, and Asyncer. And it will also power the documentation API reference (to be released soon).

3 Likes

Are there any plans to allow @deprecated to be usable without brackets or as a plain function?

@deprecated
def foo(...): ...

deprecated(foo)

In many cases, a simple default message like “function ‘foo’ is deprecated” could be sufficient, reducing necessary boilerplate. All the following can be supported simultaneously:

@deprecated  # "bare" mode, default message
def foo(): ...

@deprecated(stacklevel=2)  # stacklevel given, default message
def foo(): ...

@deprecated("custom message")  # custom message
def foo(): ...

deprecated(some_callable)  # used as a plain function
deprecated(some_callable, "custom message")
deprecated(some_callable, stacklevel=2)

In particular, it seems the current implementation in typing_extensions does not allow non-decorator usage, but this is important for instance when one wants to deprecate a function that is set dynamically (cls.method = deprecated(method)).

This can be easily achieved using the func=None trick:

import functools
import warnings

def make_default_message(func):
    return f"{func.__qualname__} is deprecated"

def deprecated(func=None, msg=None, /, *, category=DeprecationWarning, stacklevel=1):
    """Indicate that a class, function or overload is deprecated."""
    if isinstance(func, str):
        # used as deprecated("message") -> shift arguments
        assert msg is None
        msg = func
        func = None

    def wrap(target, message):
        if message is None:
            message = make_default_message(target)
        @functools.wraps(target)
        def wrapped(*args, **kwargs):
            warnings.warn(message, category, stacklevel=stacklevel + 1)
            return target(*args, **kwargs)
        wrapped.__deprecated__ = message
        return wrapped

    if func is None:
        # used with brackets -> return a decorator
        return lambda decorated: wrap(decorated, msg)

    # used without brackets -> wrap func directly
    return wrap(func, msg)

Caveat: this would fail if someone used it on a callable subclass of str!

You can write cls.method = deprecated("Use other_method instead")(method) for this.

I’m open to allowing @deprecated without parentheses if there is a widespread request for support, but I would prefer to leave it out. This change would complicate the implementation (and the type of the decorator itself!) and I like that the current specification forces people to provide a deprecation message.

3 Likes

One problem is that of referencing the name of the deprecated object. For example, say one wants to emit the message "Method {method_name} of class {class_name} is deprecated."

The advantage of the default message is that it has access to func.__name__, func.__qualname__ and func.__self__, and can template these values. Otherwise, one has to add them manually, which is prone to typos and to being missed during refactoring.
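For example, a message template built from func.__qualname__ alone can recover both the method and class names (a sketch, not a proposed API):

```python
def method_message(func):
    # "MyClass.my_method" -> ("MyClass", "my_method"); plain functions
    # have no dot in their qualified name.
    qualname = func.__qualname__
    cls_name, _, meth_name = qualname.rpartition(".")
    if cls_name:
        return f"Method {meth_name} of class {cls_name} is deprecated."
    return f"Function {meth_name} is deprecated."
```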

Another issue with the current PEP draft is that adding the __deprecated__ attribute is not possible when the decorated object implements __slots__, or when __setattr__ is otherwise disabled on the annotated object. It should be decided whether such objects are excluded, or whether a fallback mechanism is used (for instance, creating a wrapping class that has __deprecated__).
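One possible shape for such a fallback, sketched (mark_deprecated is a hypothetical helper, not part of the PEP):

```python
import functools

def mark_deprecated(obj, msg):
    # Try to set the attribute in place; if the object rejects
    # attribute assignment (e.g. a slotted callable instance),
    # return a wrapper that carries __deprecated__ instead.
    try:
        obj.__deprecated__ = msg
        return obj
    except (AttributeError, TypeError):
        @functools.wraps(obj)
        def wrapper(*args, **kwargs):
            return obj(*args, **kwargs)
        wrapper.__deprecated__ = msg
        return wrapper
```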

I just opened a PR with changes to the PEP: PEP 702: Move to warnings, expand spec by JelleZijlstra · Pull Request #3442 · python/peps · GitHub :

  • Move the decorator to warnings, as that seems the most commonly supported option.
  • Mandate that overriding a deprecated method with @override should trigger a warning, as discussed above.
  • Explain that syntax that triggers an indirect call to a deprecated method, such as a property setter, should also trigger a warning.
  • Explain that the decorator will fail if it can’t set the __deprecated__ attribute. Note that the PEP explicitly says that the decorator is only for functions, classes, and methods, all of which support setting attributes (barring exotic metaclasses).
3 Likes