Proposal: Add `bound` to `ParamSpec`

Hi all! Is it feasible to add a bound parameter to ParamSpec? Sometimes, manually specifying the bound can greatly reduce the difficulty of using ParamSpec.

For example, I have a base class defined like this:

class IntView:
    def __init__(self, init_data: int):
        ...

Then I want to inherit IntView like this:

class RangedIntView(IntView):
    def __init__(self, init_data: int, min_value: int, max_value: int):
        super().__init__(init_data)
        self._min_value = min_value
        self._max_value = max_value

Since the parameters of IntView.__init__ may change in the future, I want to use *args and **kwargs in RangedIntView.__init__ so that I won’t need to adjust RangedIntView.__init__'s parameters later. Like this:

class RangedIntView(IntView):
    def __init__(
        self,
        min_value: int,
        max_value: int,
        *args,
        some_key_arg: bool = False,
        **kwargs,
    ):
        super().__init__(*args, **kwargs)
        self._min_value = min_value
        self._max_value = max_value
        self._some_key_arg = some_key_arg

Now I want to add type hints to RangedIntView.__init__. I want to borrow the type hints from IntView.__init__. Something like this:

P = ParamSpec("P", bound=IntView.__init__)

class RangedIntView(IntView):
    def __init__(
        self,
        min_value: int,
        max_value: int,
        *args: P.args,
        some_key_arg: bool = False,
        **kwargs: P.kwargs,
    ):
        super().__init__(*args, **kwargs)
        self._min_value = min_value
        self._max_value = max_value
        self._some_key_arg = some_key_arg

We could further add a Super, and then we could reuse this ParamSpec:

from typing import Super


P = ParamSpec("P", bound=Super)

class RangedIntView(IntView):
    def __init__(
        self,
        min_value: int,
        max_value: int,
        *args: P.args,
        some_key_arg: bool = False,
        **kwargs: P.kwargs,
    ):
        super().__init__(*args, **kwargs)
        self._min_value = min_value
        self._max_value = max_value
        self._some_key_arg = some_key_arg

    def some_other_function(self, arg_1: int, *args: P.args, **kwargs: P.kwargs):
        # do something with args_1
        super().some_other_function(*args, **kwargs)

Feedback appreciated. Thanks!


bound, covariance, and contravariance were intentionally left out of scope for PEP 612, to be specified by a future PEP, since they’re not trivial to specify. So this has been on people’s minds, but nobody has had a good enough solution yet to write up a PEP.

The issue is that we don’t really have a way yet to refer to the complete signature of a function in a typing context. Type checkers keep track of the various argument types internally, but there’s no way to say, for example, “I want a Callable with a keyword-only argument foo” without using a callback protocol, which works in some contexts but is insufficient in others.
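For concreteness, here is a minimal sketch of the callback-protocol workaround mentioned above (the names TakesFoo and greet are made up for illustration): a Protocol with a __call__ method is currently the only way to spell “a callable with a keyword-only argument foo” in a typing context.

```python
from typing import Protocol


class TakesFoo(Protocol):
    # "any callable taking a keyword-only str argument `foo`"
    def __call__(self, *, foo: str) -> str: ...


def greet(*, foo: str) -> str:
    return f"hello, {foo}"


# Accepted by type checkers: greet's signature matches the protocol.
handler: TakesFoo = greet
result = handler(foo="world")
```

This works when you just need to annotate a variable or parameter, but as the post says, it doesn’t compose with ParamSpec: there is no way to turn TakesFoo back into P.args / P.kwargs.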

So I think the first step should be to come up with a way to extend how we talk about function signatures in a typing context. mypy has a deprecated feature for this which never really took off and as a result is not compatible with ParamSpec: Additional features - mypy 1.7.1 documentation

But I think we would need something along those lines to have a satisfying solution, since implicitly treating callable runtime variables as valid upper bounds for a ParamSpec seems ambiguous. To top it off there’s also the issue of function overloads and how those should be treated/expressed in a pure typing context.


Thanks for your explanation!

However, overriding a function or just passing another function’s kwargs through is an extremely common use case. I don’t understand how ParamSpec went through a full development cycle straight to release without getting it. The most obvious solution would be the following:

def fn_a(*, foo: str | None = None, bar: str = 'beer'):
    ...
def fn_b(x: int, y: str, **kwargs: ParamSpec(fn_a).kwargs):
    print(x, y)
    fn_a(**kwargs)

Alternative forms:

**kwargs: fn_a.kwargs
**kwargs: ParamSpec.from_callable(fn_a).kwargs

This should result in fn_b having the following signature:

def fn_b(x: int, y: str, *, foo: str | None = None, bar: str = 'beer') -> None:
    ...

While I don’t disagree that it’s a common use case, I don’t think it’s obvious at all how this should be spelled. ParamSpec is already quite a big PEP and took quite a while for type checkers to support fully. TypeVarTuple is in the same boat; it didn’t specify bound either. I am not sure either of those PEPs would have been accepted if they had defined semantics for bound, since it’s not obvious what those semantics should be: there are more complex cases than just “create a ParamSpec based on a function’s signature”.

Also, besides fully copying another function’s parameters, you actually do want to be able to specify some optional/named/variadic parameters when binding a generic, without having to write a dummy function/protocol just so you can convert it into a ParamSpec with an upper bound.

Also, your proposal completely disregards PEP 696: what do you do if you want to change not only the bound but also the default?


I get the sense that ParamSpec was built mainly with decorators in mind, i.e. when you already have a fn: Callable[P, T] in scope. I like your example of ParamSpec(fn_a).kwargs, and it reminds me of one of the rejected ideas in PEP 612: ParametersOf[...].kwargs

I found this topic while pondering the inability to use ParamSpec with an arbitrary CallT = TypeVar("CallT", bound=Callable), e.g. when trying to write generic code that supports both callback protocols and plain Callables: I’m running into the issue that there’s no way to get a ParamSpec out of a CallT.

If we had ParametersOf, your example would work, using fn_a directly as a function literal:

def fn_a(*, foo: str | None = None, bar: str = 'beer'):
    ...
def fn_b(x: int, y: str, **kwargs: ParametersOf[Literal[fn_a]].kwargs):
    print(x, y)
    fn_a(**kwargs)

or we could create a Protocol for any kind of function with that signature:

class FooBarCallable(Protocol):
    def __call__(self, *, foo: str | None = None, bar: str = 'beer'): ...

def fn_b2(x: int, y: str, **kwargs: ParametersOf[FooBarCallable].kwargs):
    print(x, y)
    fn_a(**kwargs)

or we could make fn_b3 work with any sort of callable/callback protocol:

def fn_b3[F: Callable](any_fn: F, x: int, y: str, *args: ParametersOf[F].args, **kwargs: ParametersOf[F].kwargs) -> ReturnType[F]:
    print(x, y)
    return any_fn(*args, **kwargs)

As I understand it, none of these examples work with ParamSpec, and even adding a bound= parameter wouldn’t allow referencing the return type of a callable (a similar limitation).

When discussing ParametersOf and ReturnType, the authors of PEP 612 write:

In summary, between these two equivalently powerful syntaxes, ParamSpec fits much more naturally into the status quo.

but while this might be true in the context of decorators, I’m not seeing how ParamSpec is sufficient for the general problem of dealing with arbitrary function signatures in other contexts such as inheritance or callback protocols.
