Exponentiating functions to implement nested calls

Would it be possible to allow all functions to override __pow__ so that (f**2)(x) == f(f(x))?
It would make code read more like math and give an easy way to write and read nested calls.

Demo example:


class Pow:
    def __init__(self, f):
        self.f = f

    def __call__(self, x):
        return self.f(x)

    def __pow__(self, n):
        if not isinstance(n, int) or n < 0:
            raise ValueError("power must be a nonnegative integer")

        def composed(x):
            y = x
            for _ in range(n):
                y = self.f(y)
            return y

        return Pow(composed)
    
    @classmethod
    def able(cls, f):
        return cls(f)

@Pow.able
def f(x):
    return x**2

[(f**k)(2) for k in range(4)]  # [2, 4, 16, 256]

Maybe it’s just me, but I think it would be a bit strange to jump straight to exponentiation when we can’t even do function composition, i.e., being able to write (f @ g)(x) (or some other operator) instead of f(g(x)).
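For reference, composition via __matmul__ only takes a few lines as an opt-in wrapper. A minimal sketch (the Composable name is illustrative, not an existing library):

```python
class Composable:
    """Wrap a callable so that (f @ g)(x) == f(g(x))."""

    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        return self.func(*args, **kwargs)

    def __matmul__(self, other):
        # f @ g applies g first, then f, matching mathematical convention.
        return Composable(lambda *a, **kw: self.func(other(*a, **kw)))


@Composable
def f(x):
    return x + 1

@Composable
def g(x):
    return 2 * x

assert (f @ g)(3) == f(g(3)) == 7   # f(g(3)) = (2*3) + 1
assert (g @ f)(3) == g(f(3)) == 8   # g(f(3)) = 2*(3+1)
```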

5 Likes

I like that idea too.

Most functions are not mathematical functions where the range is a subset of the domain, so this would only apply quite narrowly.

I think this could make sense in a functional programming or symbolic mathematics library, but I don’t think it should be a built-in feature of functions. (It may even already exist in one of the many existing libraries: GitHub - sfermigier/awesome-functional-python: A curated list of awesome things related to functional programming in Python.) I think you need to opt-in to your functions being treated that way, as with your decorator.

6 Likes

Great point.

In math, f**-1 would mean ‘inverse’, e.g., making (math.sin**-1)(x) equivalent to math.asin(x). For a programmed function (code, not symbolic), that’s not feasible (unless by interpolation, which would be slow and might give an unexpected answer).

For just positive powers, you could make a functional (not infix operator) func_pow(func, pow).
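A minimal sketch of that functional form, restricted to nonnegative powers (the func_pow name and signature here are just illustrative):

```python
def func_pow(func, power):
    """Return a function that applies `func` `power` times; power 0 gives the identity."""
    if not isinstance(power, int) or power < 0:
        raise ValueError("power must be a nonnegative integer")

    def nested(x):
        for _ in range(power):
            x = func(x)
        return x

    return nested


assert func_pow(lambda x: x + 1, 3)(0) == 3   # incremented three times
assert func_pow(abs, 0)(-5) == -5             # zero applications: identity
```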

1 Like

A decorator that augments functions with these features is a very cool idea.

Just trying to specify what properties that decorator should have, I’m noticing some questions that don’t have an obvious best answer.

For example, to me it is quite reasonable that if f(a: int, b: int) -> tuple[int, int], then f@f and f**2 are meaningful. But I can’t see a good way to deal with the asymmetry in the sequence [None, int, tuple[int, int], tuple[int, int, int], tuple[int, int, int, int], …]

I haven’t been able to find a library that implements a decorator that attempts to do this, but I can’t see this[1] requiring any changes to core Python in the short term.


  1. developing the appropriate decorator that implements __matmul__, __pow__ and easier ‘partial’ construction ↩︎

Can’t you have the decorator add an overload?

@overload
def wrapper(pair: tuple[int, int], /) -> tuple[int, int]: …

Then the implementation can check the arguments’ type and call the wrapped function accordingly.

Maybe I’m missing something. What do you mean by “check the arguments’ type and call the wrapped function accordingly”?

The issue I’m seeing is that functions can be called with tuples as their args, so you can’t just implement the rule “if a tuple, then unwrap”. If you did that, then with

@func
def f(x):
  return 2*x

f((2,3))

would crash.

Indeed, that complicates things and necessitates discerning between

  • unary ((T)->T) functions (let’s call them type EndoFunction[T] = Callable[[T], T])
  • and n-ary ((*Ts)->tuple[*Ts]) functions (type TupleFunction[*Ts] = Callable[[*Ts], tuple[*Ts]])

not by type, but rather by inspection of the signature of the function that’s to be raised to a power, like:

import collections
import inspect
from collections.abc import Callable

def is_unary_function[**PS, RT](function: Callable[PS, RT]) -> bool:
    function_params = inspect.signature(function).parameters.values()
    param_counts = collections.Counter(param.kind for param in function_params)
    pos_params_count = (param_counts[inspect.Parameter.POSITIONAL_ONLY]
                        + param_counts[inspect.Parameter.POSITIONAL_OR_KEYWORD])
    return pos_params_count == 1
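To see the signature-inspection idea in action, here is a self-contained, non-generic rendition of that helper, applied to the two kinds of functions discussed above:

```python
import collections
import inspect


def is_unary(function):
    """Count the parameters that can be filled positionally; unary means exactly one."""
    params = inspect.signature(function).parameters.values()
    kinds = collections.Counter(param.kind for param in params)
    positional = (kinds[inspect.Parameter.POSITIONAL_ONLY]
                  + kinds[inspect.Parameter.POSITIONAL_OR_KEYWORD])
    return positional == 1


def square(x: int) -> int:                    # EndoFunction-style: one argument
    return x ** 2

def swap(x: int, y: int) -> tuple[int, int]:  # TupleFunction-style: two arguments
    return y, x

assert is_unary(square)
assert not is_unary(swap)
```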

With that is_unary_function helper and the defined types, we can use the function-to-a-power func_pow function below (with some casts to help mypy understand):

import functools
from typing import cast, overload

@overload
def func_pow[T](function: EndoFunction[T], power: int) -> EndoFunction[T]: ...
@overload
def func_pow[*Ts](function: TupleFunction[*Ts], power: int) -> TupleFunction[*Ts]: ...

def func_pow[T, *Ts](function: EndoFunction[T] | TupleFunction[*Ts], power: int) -> EndoFunction[T] | TupleFunction[*Ts]:
    if power < 0:
        raise ValueError("power must be a nonnegative integer")
    if is_unary_function(function):
        @functools.wraps(function)
        def wrapper(arg: T, /) -> T:
            for _ in range(power):
                arg = cast(EndoFunction[T], function)(arg)
            return arg
    else:
        @functools.wraps(function)
        def wrapper(*args: *Ts) -> tuple[*Ts]:
            for _ in range(power):
                args = cast(TupleFunction[*Ts], function)(*args)
            return args

    return wrapper

This version of func_pow tests and type-checks okay for each of the aforementioned examples:

def square(x: int) -> int:
    return x ** 2

assert func_pow(square, 2)(3) == (3 ** 2) ** 2  # squared twice

def pair_result(x: int, y: int) -> tuple[int, int]:
    return divmod(x, y)

assert func_pow(pair_result, 2)(28, 5) == (1, 2)  # not necessarily sane, but correct (28,5)->(5,3)->(1,2)

def double(x: int | tuple[int, ...]) -> int | tuple[int, ...]:
    return 2 * x

assert func_pow(double, 0)(3) == 3  # doubling zero times, i.e. identity function
assert func_pow(double, 2)(3) == 12  # doubling an int twice
assert func_pow(double, 0)((2, 3)) == (2, 3)  # doubling zero times, i.e. identity function
assert func_pow(double, 2)((2, 3)) == (2, 3, 2, 3, 2, 3, 2, 3)  # doubling a tuple twice

That could be the operational basis for a decorator that enhances a decorated function from straight FunctionType to a Callable subtype that supports @ and ** (and perhaps currying).
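As a sketch of what such a decorator could look like, restricted to unary functions for brevity (the Func name and all details are illustrative, not a real library):

```python
import functools


class Func:
    """Wrap a unary callable so it supports `@` (composition) and `**` (iteration)."""

    def __init__(self, func):
        functools.update_wrapper(self, func)  # preserve __name__, __doc__, etc.
        self.func = func

    def __call__(self, x):
        return self.func(x)

    def __matmul__(self, other):
        # (f @ g)(x) == f(g(x))
        return Func(lambda x: self.func(other(x)))

    def __pow__(self, n):
        if not isinstance(n, int) or n < 0:
            return NotImplemented
        result = Func(lambda x: x)  # identity, covers n == 0
        for _ in range(n):
            result = self @ result
        return result


@Func
def inc(x):
    return x + 1

@Func
def double(x):
    return 2 * x

assert (inc @ double)(3) == 7  # inc(double(3))
assert (double ** 3)(1) == 8   # double applied three times
assert (double ** 0)(5) == 5   # identity
```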

1 Like

When it comes to “reading like math”, perhaps look to Mathematica to see what they do. Instead of overloading exponentiation, it has a built-in symbol called Nest. That is nicely self-explanatory and would work for all callables, not just function objects.

Besides, people will be more open to an addition to functools than to a new core language feature.
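A sketch of what such a functools-style helper might look like (the nest name and signature are made up here, mirroring Mathematica’s Nest[f, x, n]):

```python
def nest(func, x, n):
    """Apply `func` to `x` a total of `n` times; n == 0 returns `x` unchanged."""
    if n < 0:
        raise ValueError("n must be nonnegative")
    for _ in range(n):
        x = func(x)
    return x


assert nest(lambda x: x ** 2, 2, 3) == 256  # ((2**2)**2)**2
assert nest(lambda x: x + 1, 0, 5) == 5
```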

1 Like