Taking the argument signature from a different function

First let me make clear: I’m not writing a decorator.

This is a real issue from my cattrs library. I have a class with a certain __init__ signature. I have many functions, in different modules, essentially wrapping this __init__. These functions all take *args, **kwargs and use them to call this __init__. I want all of these functions to have the same input parameters as this __init__ at type-checking time, and not *args: Any, **kwargs: Any.

One solution is just to copy/paste the arguments and their types. This kinda sucks because I’d have to duplicate a lot of code and I don’t know how to ensure they’re in sync going forward.

I tried just using functools.wraps; here’s a simplified example:

from functools import wraps
from typing import Any


def inner(a: int) -> int:
    return a


@wraps(inner)
def wrapper(*args: Any, **kwargs: Any) -> None:
    inner(*args, **kwargs)
    return

But this doesn’t seem to work with Mypy (the signature of wrapper is unchanged). Pyright thinks it’s a _Wrapped[(a: int), int, (*args: Any, **kwargs: Any), None], so maybe slightly better, but autocomplete in VS Code still claims the signature is (*args: Any, **kwargs: Any).

Am I missing something? Does the Python typing system have a way of doing this?

4 Likes

I’m not aware of a utility type that performs this, but it sounds like what you want is the Parameters utility type from TypeScript.

I’ve missed having this in Python as well so that when parameters change it doesn’t cause as much churn in large code bases.

I’m not sure I fully understand what you’re trying to do here. It looks odd that you’re “wrapping” a function that returns a non-None value but then discarding the return value and returning None instead.

Does this meet your needs?

from typing import Any, Callable


def my_wraps[**P](
    inner: Callable[P, Any]
) -> Callable[[Callable[..., None]], Callable[P, None]]:
    def impl(x: Callable[..., None]) -> Callable[P, None]:
        return x

    return impl

@my_wraps(inner)
def wrapper(*args: Any, **kwargs: Any) -> None:
    inner(*args, **kwargs)
    return

Is this an issue with Python or with the tools (e.g. VS Code, pyright)?

functools.wraps should update the signature correctly, and in a jupyter notebook that appears to be the case:

from functools import wraps

def inner(a: int) -> int:
    """inner docstring"""
    return a

@wraps(inner)
def wrapped(*args, **kwargs):
    return inner(*args, **kwargs)

wrapped(

When I hit shift-tab to get the signature, I see

Signature: wrapped(a: int) -> int
Docstring: inner docstring
File:      /var/folders/v4/tx_h1xqj1t741r3n1zfpg_lw0000gp/T/ipykernel_81241/3276479464.py
Type:      function

So it seems like everything worked as you wanted, but these tools don’t agree. Perhaps this is because wraps updates the signature at runtime and the typing tools don’t do that; possibly they would need special-cased logic.

If the idea is that the wrapper doesn’t return the same thing, so the signature changes, then Eric’s version works, and wraps incorrectly says that the return type is int. This requires 3.12 syntax, though.

1 Like

I am not sure if this is what OP is talking about, but this is something I have also run across when trying to type hint alternate constructors:

class Parser:
    def __init__(self, grammar: str, *, option_1: int = ..., option_2: str = ..., option_3: Literal[...] = ...):
        ...

    @classmethod
    def from_file(cls, grammar_path: PathLike | str, **kwargs) -> Self:
        return cls(open(grammar_path).read(), **kwargs)

How do I tell the type checkers that kwargs should be the same as the keyword arguments of __init__?

This is a slight special case where the *args part changes as well. But even when that doesn’t change: how do I do this?

Pyright is applying the correct type evaluation rules and producing the correct signature for functools.wraps based on static analysis — and based on the documented semantics of functools.wraps. This isn’t a situation where special-cased logic would produce better results.

Consider the following variant:

from functools import wraps

def inner(a: int) -> int:
    return a

@wraps(inner)
def wrapped(x: str, y: str) -> int:
    return inner(int(x) + int(y))

print(wrapped("1", "2"))

Note that the decorated function wrapped retains its original undecorated signature. It doesn’t adopt the signature of the function it’s wrapping (in this case, inner).

In a Jupyter notebook, you are running a Python environment, and the tools can introspect the actual type of the live object. Static analysis tools cannot do this. They need to base their analysis on static type information.

It’s trivial to make this work prior to 3.12. Just use the old way to manually define a ParamSpec.
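
For example, the decorator above could be spelled like this before 3.12 (a sketch; ParamSpec is in typing from 3.10, and in typing_extensions before that):

from typing import Any, Callable, ParamSpec

P = ParamSpec("P")  # declared manually instead of with the 3.12 `[**P]` syntax


def my_wraps(inner: Callable[P, Any]) -> Callable[[Callable[..., None]], Callable[P, None]]:
    def impl(x: Callable[..., None]) -> Callable[P, None]:
        return x

    return impl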

You can define a TypedDict that contains the common set of keyword args and use it to annotate a **kwargs in both the __init__ and from_file signatures.
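
For example, a minimal sketch of that approach applied to the Parser example above (the ParserOptions name is mine; Unpack, NotRequired, and Self are available from typing in 3.11+ or from typing_extensions, and typing **kwargs this way is the PEP 692 style supported by recent type checkers):

from os import PathLike
from typing import NotRequired, Self, TypedDict, Unpack


class ParserOptions(TypedDict):
    # The keyword arguments shared by __init__ and from_file.
    option_1: NotRequired[int]
    option_2: NotRequired[str]


class Parser:
    def __init__(self, grammar: str, **kwargs: Unpack[ParserOptions]) -> None:
        ...

    @classmethod
    def from_file(cls, grammar_path: PathLike | str, **kwargs: Unpack[ParserOptions]) -> Self:
        return cls(open(grammar_path).read(), **kwargs)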

1 Like

If I understood you correctly, you are creating many instances of this class across several modules, and you want the type checker to guarantee that they all stay in sync?
Although I’m not sure why a type checker specifically; maybe you should rethink your design. Instead of instantiating your class in multiple places, you could pass the instance in as an argument, which would remove the issue.
Also, if you want to take advantage of type checkers, I wouldn’t write catch-all parameters like *args unless you are actually passing a variable number of arguments to the function or you need them for inheritance. Passing a list of a specific type, or defining a custom type that can be checked, may be a better option.

Are you requesting something similar to:

?

Yeah, in the original example I need to change the return type (to be a subclass with extra methods).

Yes, thanks! I guess a decorator was the solution all along. I tweaked it a little and ended up with:

from typing import Any, Callable, ParamSpec, TypeVar

P = ParamSpec("P")
T = TypeVar("T")


def wrap(_: Callable[P, Any]) -> Callable[[Callable[..., T]], Callable[P, T]]:
    """Wrap a `Converter` `__init__` in a type-safe way."""

    def impl(x: Callable[..., T]) -> Callable[P, T]:
        return x

    return impl
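
For illustration, usage looks roughly like this (the class below is just a stand-in for the real Converter, and the names are illustrative):

from typing import Any


class Converter:  # stand-in for cattrs' Converter, for illustration only
    def __init__(self, *, omit_if_default: bool = False) -> None:
        self.omit_if_default = omit_if_default


@wrap(Converter)  # P is captured from Converter's __init__ (minus self)
def make_converter(*args: Any, **kwargs: Any) -> Converter:
    return Converter(*args, **kwargs)

A type checker should now see make_converter as (*, omit_if_default: bool = False) -> Converter rather than (*args: Any, **kwargs: Any) -> Converter.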

Ok, my problem being solved, I guess I’m still confused about why functools.wraps doesn’t do this automatically in a type-checking context (ignoring the return type). Maybe I’m misunderstanding what functools.wraps is supposed to do; the docs aren’t very clear.

Right, but why?

I already have a different API that does this, but it’s worse because I need to actually tweak the parameters a little bit before passing them on, and because it requires an additional import so ergonomics suffer.

Yeah, the use cases seem similar.

That’s how functools.wraps works. This function is intended to decorate a wrapper function that wraps some inner function. It’s possible for the signature of the wrapper function to differ from the wrapped function. The final signature therefore needs to retain the wrapper’s signature (which is the signature that’s visible to callers). It should not reflect the signature of the wrapped function, which is hidden from callers.

Roger. I’m just saying I find it confusing.

So the takeaway is that functools.wraps just affects the result of inspect.signature but doesn’t affect typing tools. My intuition is trained to expect static typing to try to follow what’s happening at runtime. Would it be a big deal to make the typing tools mimic the runtime effect?

2 Likes

Would it be a big deal to make the typing tools mimic the runtime effect?

Pyright is honoring the runtime behavior here. The runtime behavior exposes the signature of the wrapper, not the wrapped function. The two signatures can be different, as shown in my example above. In your example at the top of this thread, the signature of the wrapper is (*args: Any, **kwargs: Any) -> None, and that’s the signature that pyright (correctly) evaluates for the decorated function. If a static type checker were to ignore the signature of the wrapper and instead expose the signature of the wrapped function, then it would be deviating from the runtime behavior.

I think there are two differing definitions of “the runtime behavior” being used here.

The supported way to introspect the signature of a function at runtime is inspect.signature. It takes a follow_wrapped flag, which defaults to True. This means that it will show, at runtime, by default, the signature of the wrapped function, not of the wrapper. This strongly suggests that functools.wraps should also imply “same signature.”

I am not sure if this behavior of inspect.signature is correct/ideal, or how well it matches real-world usage of functools.wraps (i.e. how often is it used for signature-changing wrappers?). For a user of functools.wraps to avoid this default behavior of inspect.signature, they need to explicitly set a __signature__ on the wrapper function.
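
For instance, a quick runnable sketch of both the default and the opt-out:

import inspect
from functools import wraps


def inner(a: int) -> int:
    return a


@wraps(inner)
def wrapper(*args, **kwargs):
    return inner(*args, **kwargs)


# By default, inspect.signature follows the __wrapped__ attribute set by wraps:
print(inspect.signature(wrapper))  # (a: int) -> int

# Setting an explicit __signature__ on the wrapper stops that unwrapping:
wrapper.__signature__ = inspect.Signature(
    [
        inspect.Parameter("args", inspect.Parameter.VAR_POSITIONAL),
        inspect.Parameter("kwargs", inspect.Parameter.VAR_KEYWORD),
    ]
)
print(inspect.signature(wrapper))  # (*args, **kwargs)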

So the question here is whether type-checkers’ understanding of signatures should (or even can) match this runtime behavior of inspect.signature.

1 Like

This came up for me while working on pyanalyze, a (mostly) static type checker that looks at runtime function objects. We use inspect.signature to get the signature of functions, but should we use follow_wrapped=True (i.e., get the signature of the wrapped function when functools.wraps is involved) or False (i.e., use the signature of the wrapper)? Ideally we’d use True, because the wrapper’s signature is often an uninformative *args/**kwargs, but that caused problems in practice because it’s in fact common for wrapper functions to modify the signature. Common examples are @contextlib.contextmanager (which changes the return type) and @unittest.mock.patch (which often adds a parameter).
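
For instance, here is the contextmanager case (a small sketch):

import contextlib
import inspect
import time
from typing import Iterator


@contextlib.contextmanager
def timer(label: str) -> Iterator[float]:
    start = time.monotonic()
    yield start
    print(label, time.monotonic() - start)


# contextmanager applies functools.wraps internally, so following __wrapped__
# reports the generator's annotation, even though calling timer("x") actually
# returns a context manager rather than an iterator:
print(inspect.signature(timer))  # (label: str) -> Iterator[float]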

2 Likes

Per the documentation, functools.wraps is just a convenience function for updating the signature to match the wrapped function. So it’s kind of odd to me that it doesn’t do that in other cases.

It does seem like Eric’s solution is the right answer: a decorator that is sort of like wraps but does something a little different.

It seems like if the signature changes, wraps isn’t the right decorator to use, and the other option is a custom one like Eric’s. I don’t know if that’s common enough to justify adding it to functools, but maybe?

AFAICS the documentation for functools.wraps (and functools.update_wrapper) never explicitly mentions signatures at all. Which makes sense, because it currently doesn’t do anything with signatures! It does, however, set the __wrapped__ attribute to the wrapped function, and inspect.signature then (optionally, but by default) follows that instead of returning the wrapper signature.

I don’t think we need a new decorator, regardless. functools.wraps is, by design, already quite flexible; it allows you to specify precisely which attributes of the wrapped function to copy to the wrapper. It could easily be given a way to control setting __signature__ on the wrapper, too.
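
Purely as a hypothetical sketch of that kind of control (not an existing API), it could look something like this:

import inspect
from functools import WRAPPER_ASSIGNMENTS, wraps


def wraps_with_signature(wrapped, assigned=WRAPPER_ASSIGNMENTS, *, set_signature=True):
    # Hypothetical: behave like functools.wraps, but optionally pin an explicit
    # __signature__ onto the wrapper instead of relying on inspect.signature's
    # default of following __wrapped__.
    def decorator(wrapper):
        wrapper = wraps(wrapped, assigned=assigned)(wrapper)
        if set_signature:
            wrapper.__signature__ = inspect.signature(wrapped)
        return wrapper

    return decorator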

Carl

1 Like

Okay, fine, it updates all the function metadata from which the signature is derived on inspection :sweat_smile: I get the distinction you’re making, though.

While we hash this out, I’ve quickly released tightwrap · PyPI as an experiment.

2 Likes

We have a very similar use case. We want to take the argument signature from the parent class when we override __init__ in subclasses, but we also want to extend the signature a little bit.

We asked if it’s possible to achieve this by adding a bound parameter to ParamSpec: Proposal: Add `bound` to `ParamSpec`

It seems the main issue is that we currently need a better way to express a function signature in a purely static typing context.

1 Like