PEP 702: Marking deprecations using the type system

I present PEP 702, which proposes adding a @typing.deprecated decorator that provides a way to communicate deprecated functionality to static type checkers.

Let me know if you have any thoughts on this proposal.

There is currently one open issue: Should the decorator raise a runtime warning? I propose that it should not, but there are some strong arguments for doing this. I would be interested to hear more opinions. Previous discussion is in typing-sig.

17 Likes

The mailman archive seems incomplete. It doesn't contain @sobolevn's reply (and my subsequent, but a bit inconsequential, reply). Maybe there are some issues with non-ASCII characters?

Notably, it does not issue a runtime DeprecationWarning.

Can it be managed by an interpreter command-line option instead? I believe piggybacking on -d or adding -W deprecated could be a fitting approach.

It would benefit CPython itself, replacing both the older warnings.warn() calls and the newer warnings._deprecated().

Edit: whether we should hide stdlib deprecations by default, as is proposed for third-party libraries, or do the opposite and always issue the warning for both, or leave these parts as they are (non-uniform), is a subject for a separate discussion.

Warnings filter settings (which can also be altered with the -W flag and the PYTHONWARNINGS env var) already determine whether particular warnings are shown. If typing.deprecated raised warnings, why should there be a separate option for determining whether those specific warnings are shown? Note that this can already be achieved simply by raising a subclass of DeprecationWarning, since you can then filter based on that subclass, but I'm unsure what would make warnings from typing.deprecated so special.
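
For example, something like this already works today with a custom warning subclass (all names here are made up):

import warnings

class LibDeprecationWarning(DeprecationWarning):
    """Hypothetical library-specific deprecation category."""

def old_api():
    warnings.warn("old_api() is deprecated", LibDeprecationWarning, stacklevel=2)

# Silence only this library's deprecations; other warnings keep their configured behavior.
warnings.filterwarnings("ignore", category=LibDeprecationWarning)

old_api()  # filtered out
warnings.warn("unrelated deprecation", DeprecationWarning)  # not affected by the filter above

The same effect is available from outside the program via -W or PYTHONWARNINGS by naming the fully qualified warning class as the category.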

I followed the discussion on typing-sig and I can see some merit to having an opt-in way to raise a warning without having to call warnings.warn manually, but I'm not convinced that it warrants the increase in complexity. It would require adding 2 or 3 more arguments: an argument for determining whether a warning should be raised (I'm more for opt-in than for opt-out), stacklevel (I have no clue what the default should be here; the Python documentation suggests using stacklevel=2 for wrapper functions, but that makes the warning point to the code of the function that is deprecated rather than the code that called that function; off-topic, but IMO the current documentation, in addition to a bad default, suggests an anti-pattern that makes it hard to pinpoint what uses the deprecated thing), and maybe category (to specify a subclass of DeprecationWarning).

Additionally, it still leaves the inconsistencies that we're bound to have when it comes to overload signatures, where we can't raise the warning since doing so would require verifying types at runtime.

I personally believe that @deprecated should have the option to issue a deprecation warning and should be placed in the warnings module. I suspect that even if it doesn't have such an option, IDEs, linters, etc. - and not only type checkers - will start to rely on the decorator. But users will want to issue warnings at runtime. This means either using multiple decorators or custom decorators. The latter won't necessarily be understood by tools, which could lead to monkey-patching of typing.deprecated. I believe just supporting warnings from the beginning would prevent all those potential issues. (Try to) do it right now instead of having to change it later.

15 Likes

The no-warning behavior in the PEP makes me uneasy.

The PEP's boilerplate code looks reasonable to me, and the PEP points out that it could be encapsulated in a library, but I don't find the arguments against doing this in typing very compelling:

  • The point about generally avoiding runtime work is understandable, but I don't think it aligns with the actual behavior of the library, e.g. assert_never(), which rhymes with the proposed decorator in terms of use cases and has a much more drastic runtime effect than a warning would. On the other hand, I am much less familiar with the machinery than Jelle is; maybe I am missing some nuance here.

  • The points about edge cases don't seem to be represented in the example boilerplate, so I think the 'real' idiom is more complicated than the PEP is letting on, and I think that's a point in favor of making it the standard library's problem.

    • Maybe a wider survey of existing 'in the wild' implementations of this decorator is in order? For example, the Deprecated library referenced in the Rationale section has a more complex implementation, although not all of that is functionality that would be obligatory for an arbitrary end user.

Overall, I very much like the idea of an official way to declare something as deprecated, but not being able to opt into warnings (I'd prefer having to opt out of a warning, but I figure that's a bridge too far here :sweat_smile:) seems to me like missing a trick and sticking the end user with the bill for the resulting complexity of there being two not-quite-equivalent Ways to Do It.

As someone who works on libraries without static type checking, I am uncomfortable with the absence of runtime effects in this PEP. It creates a situation where I can miss a deprecation because it is only communicated via typing.deprecated.

By way of example (not proposal): similar to __all__, there could be

__deprecated__ = ["Class", "function", "CONSTANT"]

making use of the deprecated items warn at import time. There are problems with this form (like the question of how to make it work for from … import *), but you see the general idea. I think there are ways to communicate deprecations in a way that both type checkers and the runtime will understand, and I would like to invite exploring them.

2 Likes

Based on the feedback here I'm going to change the decorator to provide a warning, with an opt-out mechanism. I'll be back soon with details.

3 Likes

There's a lot that's missing from this that would make me unlikely to use it in Flask.

I try to be as specific and helpful as possible in my deprecation messages, because otherwise (and regardless) I have to deal with more user reports. I use some common message patterns, like:

  • '{name}' is deprecated and will be removed in Flask {version}. Use '{other}' instead.
  • The '{name}' parameter is deprecated and will be removed in Werkzeug {version}. It's always enabled now.
  • '{name}' has been renamed to '{other}'. The old name is deprecated and will be removed in Jinja {version}.
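
For illustration, roughly what emitting one of those patterns looks like today with plain warnings.warn (the names and the version number here are made up):

import warnings

def old_helper(value):
    warnings.warn(
        "'old_helper' is deprecated and will be removed in Flask 2.4."
        " Use 'new_helper' instead.",
        DeprecationWarning,
        stacklevel=2,  # attribute the warning to the caller, not to this function
    )
    return new_helper(value)

def new_helper(value):
    return value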

I write mine manually for my specific situations, but SQLAlchemy has a whole set of decorators that can handle all different types of deprecations, check call arguments, and show specific messages at runtime and in documentation. sqlalchemy/deprecations.py at 586df197615d91af56aefc0d5ff94ceac13154eb · sqlalchemy/sqlalchemy · GitHub

Despite the stats you have, I do remove or rename arguments and attributes/properties, and move things between modules, often enough. I don't plan to do it a lot, because it's disruptive, but it still happens. Perhaps your stats are only considering the latest versions, whereas if you checked a few versions ago you'd see a lot more of them (they've been fully removed at this point).

The stack level can't always be assumed to be 2. I'd need to go back and look, but I remember using 3 for base classes where you want to show the warning from __new__ or __init__ when a subclass inherits them.
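
A sketch of the kind of case I mean (made-up names): when the subclass's __init__ calls super().__init__(), the warn() call sits two frames below the user's code, so stacklevel=3 is what points the warning at the instantiation site.

import warnings

class OldBase:
    def __init__(self):
        warnings.warn(
            "OldBase is deprecated; inherit from NewBase instead.",
            DeprecationWarning,
            stacklevel=3,  # OldBase.__init__ -> Child.__init__ -> user code
        )

class Child(OldBase):
    def __init__(self):
        super().__init__()

Child()  # the warning is attributed to this line
# If Child did not define __init__ at all, stacklevel=2 would be the right value instead.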

We also have to consider imports, not only calls. I've run into cases where a downstream library re-exported a name we deprecated. They never saw the warning during tests because they never called it themselves, so users broke when we updated even though nothing changed for them. (Yes, they should pin their dependency tree.) Now that module-level __getattr__ exists, we can make sure that warnings happen at import. This also helps for moves, allowing the old name to still work with a warning pointing at the new name. I'd hope that something in the standard library would be able to warn on both import and use at runtime.
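
For illustration, the module-level __getattr__ pattern (PEP 562) I mean, with made-up names; accessing or importing the old name warns and points at the new one:

# legacy.py
import warnings

def new_helper():
    return "result"  # the functionality under its new name

def __getattr__(name):
    if name == "old_helper":  # deprecated alias kept for backwards compatibility
        warnings.warn(
            "'old_helper' is deprecated; use 'new_helper' instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        return new_helper
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

Both from legacy import old_helper and legacy.old_helper trigger the warning, so a re-export like the one described above would have warned at import time.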

Finally, projects like Flask and SQLAlchemy already have a really hard time using typing correctly and keeping up with changes. Adding another decorator we have to use on top of all the work we already do for deprecations, and that users will complain about if we don't use, isn't really helping. What would help is basically including SQLAlchemy's solution, or another solution, for full control over messaging about changing APIs.

Thanks for your feedback!

Adding a complex mechanism, like the SQLAlchemy code you link, to the standard library is risky because our backward compatibility constraints are such that we basically have to get it right the first time. In addition, I don't have an appetite for the amount of consensus-building that would be required to come up with a more complex API. If someone else is willing to do it, I won't stop them.

My stats are based only on the standard library, which may not be representative. I do acknowledge that deprecations for the things you mention occur and it would be useful to mark them, but I would like to defer that because it introduces significant additional complexity.

I plan to provide a stacklevel= parameter to override the default. I will look out for edge cases like the ones you mention when deciding on the default; it sounds like it may have to be different for classes and functions.

When I presented a first version of this proposal at the typing-sig meeting, I suggested a "deprecated_transform" mechanism (similar to PEP 681's dataclass_transform) that would mark a third-party decorator as working like typing.deprecated. I took it out because the feedback was that it would introduce too much complexity, but it may be worth reconsidering. Would that alleviate your concern?
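
Purely as a hypothetical illustration (deprecated_transform is not part of the current PEP, and the marker and names below are invented), the idea would be to let a project mark its own decorator so type checkers treat it like typing.deprecated while the project keeps full control over the runtime message:

import functools
import warnings

# @typing.deprecated_transform()   # hypothetical marker telling type checkers to treat
#                                  # this decorator like typing.deprecated
def project_deprecated(message: str, *, removed_in: str):
    """Project-specific deprecation decorator with custom message conventions."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"'{func.__name__}' is deprecated and will be removed in {removed_in}. {message}",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)
        return wrapper
    return decorator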

1 Like

Also note that gh-39615: Add warnings.warn() skip_file_prefixes support by gpshead · Pull Request #100840 · python/cpython · GitHub just landed, which lets you skip frames based on file prefix.

Very nice. I'd consider accepting only skip_file_prefixes for @deprecated, except that then I'd have to backport the feature to typing-extensions. I'll see if that is feasible.

Hi, I'm currently working on a similar library.

The goal of codecrumbs is not only to provide deprecation messages and extend the docstring, but also to offer a way to fix the code that uses the deprecated APIs.

It currently supports renaming of arguments and attributes; deprecating and replacing functions is also planned (but more complicated). The library is also at a very early stage and is not advertised very much, but I would be happy about any feedback or beta testers :slightly_smiling_face:

It currently works by inspecting the code at runtime, but using static typing might also be a way this could be implemented.
The reason I did not choose this approach is that there are different static type checkers, and it would require that the code being checked uses type annotations. The benefit would be that the user does not have to run the whole test suite to fix the deprecated code.
I also have no idea how it would be possible to hook into the type checker to provide the code fixes for the user. If anyone has an idea, please let me know.

I currently see no way to combine my library with this PEP, but maybe it can serve as inspiration for new ideas.

I have now pushed a new version of the PEP that emits runtime deprecation warnings by default: PEP 702 – Marking deprecations using the type system | peps.python.org

The @deprecated decorator takes two keyword-only arguments:

  • category: A warning class. Defaults to DeprecationWarning. If this is set to None, no warning is issued at runtime and the decorator returns the original object, except for setting the __deprecated__ attribute (see below).
  • stacklevel: The number of stack frames to skip when issuing the warning. Defaults to 1, indicating that the warning should be issued at the site where the deprecated object is called. Internally, the implementation will add the number of stack frames it uses in wrapper code.

If the decorated object is a class, the decorator wraps the __new__ methods such that instantiating the class issues a warning. If the decorated object is a callable, the decorator returns a new callable that wraps the original callable but raises a warning when called. Otherwise, the decorator raises a TypeError (unless category=None is passed).
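
To illustrate the behavior described above, usage would look roughly like this (with the typing_extensions backport; the names below are made up and details may still change):

from typing_extensions import deprecated

@deprecated("Use new_function() instead")
def old_function() -> int:
    return 1

@deprecated("Use NewClass instead", category=FutureWarning)
class OldClass:
    pass

@deprecated("Deprecated for type checkers only", category=None)
def typed_only() -> None:
    ...  # returned unchanged apart from the __deprecated__ attribute

old_function()  # emits DeprecationWarning, attributed to this call site
OldClass()      # emits FutureWarning when the class is instantiated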

There are several scenarios where use of the decorated object cannot issue a warning, including overloads, Protocol classes, and abstract methods. Type checkers may show a warning if @deprecated is used without category=None in these cases.
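
For example, a deprecated overload would be marked for type checkers only, since the overload stubs never execute at runtime (a sketch with invented names; exact type checker behavior may vary):

from typing import Union, overload
from typing_extensions import deprecated

class Writer:
    @overload
    def write(self, data: str) -> None: ...
    @overload
    @deprecated("Passing bytes is deprecated; pass str instead", category=None)
    def write(self, data: bytes) -> None: ...
    def write(self, data: Union[str, bytes]) -> None:
        if isinstance(data, bytes):
            data = data.decode()
        print(data)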

I will release this version in typing-extensions soon.

2 Likes

One nit: At the beginning of the "Specification" section, the PEP says:

The decorator takes a single argument of type str, […]

and then later in the "Runtime behavior" subsection:

The @deprecated decorator takes two keyword-only arguments: […]

Maybe it would be clearer to end the first paragraph with a sentence like "The decorator takes one required/optional(?) positional-only(?) parameter and two optional keyword-only arguments." And then start the second paragraph with "The positional-only(?) parameter is a str, …"

Good point, I forgot to update that part of the PEP. I will fix this.

Hi, I came across this because I wanted to build something like this, and it looks like it's already under way! May I make the following suggestion:


@deprecated(warn_after=<date>, error_after=<date>)
def outdated_func():
    ...

I would like the decorator to change its behavior based on the date, eventually leading to an exception. This allows me to gently nudge users and give them time to switch their code to something else.
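
To make the idea concrete, here is a minimal sketch of how such behavior could work (warn_after and error_after are my invented parameters, not anything proposed by the PEP):

import datetime
import functools
import warnings

def deprecated_by_date(message, *, warn_after, error_after):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            today = datetime.date.today()
            if today >= error_after:
                raise RuntimeError(f"'{func.__name__}' has been removed: {message}")
            if today >= warn_after:
                warnings.warn(f"'{func.__name__}' is deprecated: {message}",
                              DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        return wrapper
    return decorator

@deprecated_by_date("use new_func() instead",
                    warn_after=datetime.date(2023, 6, 1),
                    error_after=datetime.date(2024, 1, 1))
def outdated_func():
    return 42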

1 Like

It seems more standard to deprecate based on version, as the deprecation package does.
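
If I remember the deprecation package's API correctly, that looks roughly like this (treat the exact parameter names as approximate):

import deprecation

@deprecation.deprecated(deprecated_in="1.0", removed_in="2.0",
                        current_version="1.5",
                        details="Use new_function() instead")
def old_function():
    return 1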

2 Likes

I totally support what this PEP is trying to fix. I do believe a deprecated_transform, with a default implementation like the one currently in the PEP, would allow marrying the use case for newer projects with the needs of mature ones like SQLAlchemy or Twisted, which have included deprecation frameworks for years. It's additional complexity, but it's not that much additional complexity, and ignoring what those mature frameworks do is risky in its own way, as we might just end up repeating the same evolution they went through. In this sense, I agree with @davidism that looking closely at what they have is worthwhile.

2 Likes

Will this be usable for things other than classes and functions? What about constants or class members? I think that if we were able to call deprecated as a normal function, it could be expanded to those as well:

DEPRECATED_CONST: str = deprecated("foo")

class ConnectionType(StrEnum):
    Http = deprecated("Http")
    Https = "Https"