Introduce funnel operator, i.e. '|>', to allow for generator pipelines

Thank you for the warning.

I think the fact that there is pushback is a good thing - it ensures that only truly beneficial and well-thought-out solutions have a chance of making it. And the fact that such changes have a big impact on how users write code makes pronounced resistance very sensible.


“R” and “Matlab” were my first languages, and as far as I remember, the biggest obstacle in shifting to Python was all the conveniences I had to give up. The fact that I did make the shift just shows that it was definitely worth it for me, but it would be great to have the best of all worlds.

Python being a “general-purpose programming language”, I think it is natural that all areas need to grow, and I hope advancements in “functional programming” are not going to be left too far behind.

2 Likes

That exists and is called functools.partial. Since Python 3.13 there is a functools.Placeholder value which can be used as a positional-argument placeholder. Keep in mind that it is possible to do

from functools import partial as p, Placeholder as ___
from pipe import Pipe, execute

result = Pipe(source) | p(operator1, constant, ___) | p(operator2, ___, constant) | execute

(Or omitting the execute stage so that the Pipe object can work as a generator.)
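As an aside, for readers without the `pipe` library at hand, the same idea can be sketched in a few lines of plain Python (the `Pipe` class and `execute` sentinel below are hypothetical stand-ins, not the real library's API):

```python
from functools import partial

class Pipe:
    """Minimal pipeline carrier: `Pipe(x) | f | g | execute` applies f, then g, then unwraps."""
    def __init__(self, value):
        self.value = value

    def __or__(self, stage):
        if stage is execute:  # sentinel marks the end of the pipeline
            return self.value
        return Pipe(stage(self.value))

execute = object()  # end-of-pipeline sentinel

result = Pipe(3) | partial(pow, 2) | str | execute  # pow(2, 3) -> 8 -> "8"
```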

2 Likes

It is possible to implement a variant of what you are suggesting while staying efficient by introducing two new opcodes. I will give it a try.

While strictly speaking it’s not really a “reverse call”, I can see what you mean. I think I can provide some degree of flexibility while staying reasonably efficient.

I would not go into capturing the AST though or a mix of objects and AST into “components”. What I can do is provide RHS evaluated apart from the final call to the __rcall__ of the LHS as func, *args, and **kwargs. This would be the most natural and closest to overriding the regular __call__

Regarding design, since |> is just another way of making a call, the only suitable analogy is the primary '(' arguments ')' syntax itself - you cannot have a “call” that uses square brackets. It is simply not that level of syntax. A primary is closely coupled to round braces in order to make a call - it’s just the way it is.

In summary, no special components are needed. The functionality you describe can be provided to a large degree with a small addition to the existing implementation.

It will indeed enable more/different things to be built on top of it - that’s for sure.

No, it is not. __rcall__ is not the best name for it. __rpipe__ or __rapply__ would be more accurate I suppose.

So:

print(ast.dump(ast.parse('a |> list()'), indent=4))
Module(
    body=[
        Expr(
            value=BinOp(
                left=Name(id='a', ctx=Load()),
                op=Pipeline(),
                right=Call(
                    func=Name(id='list', ctx=Load()))))])

So this says it is a binary operation.

In Python, for any binary operation, both operands are independent, i.e. meaningful on their own.

While in your case, this is broken. Furthermore, the ast.Call on the RHS is incomplete. So such coupling breaks a lot (well, 2 to be more precise :)) of design conventions, which I don’t think should be broken without a very, very good reason.
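For comparison, dumping an existing binary operation shows that both operands are complete, independently evaluable nodes:

```python
import ast

# In `a | b`, both sides are ordinary expressions that are meaningful on their own.
tree = ast.parse('a | b', mode='eval')
dump = ast.dump(tree.body)
print(dump)  # BinOp(left=Name(id='a', ...), op=BitOr(), right=Name(id='b', ...))
```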

ADDITION: Ah and also, all BinOp have their magic methods I think?

AST_OPS_BIN = {
    '&': ast.BitAnd(),
    '|': ast.BitOr(),
    '^': ast.BitXor(),
    '<<': ast.LShift(),
    '>>': ast.RShift(),
    '+': ast.Add(),
    '*': ast.Mult(),
    '-': ast.Sub(),
    '/': ast.Div(),
    '//': ast.FloorDiv(),
    '@': ast.MatMult(),
    '%': ast.Mod(),
    '**': ast.Pow()
}

Yes, your approach takes a very performant path; however, the way I see it, this is more suitable as a fast-path optimisation as opposed to a concept which should lead the design.

E.g.:

a |> list

would satisfy the independence, as both a and list are meaningful on their own and the operator just performs an action with two fully evaluated objects, one on each side.

Thus, it means that in the same spirit, RHS of:

a |> func(_, 1)

should be a meaningful object in itself.

The only path to that is if func(_, 1) was a syntactic convenience for something else, which naturally is partial(func, Placeholder, 1).

Thus, _ cannot be used for this purpose, because func(_, 1) is currently a meaningful construct.
And this is why I suggested developing a “new partial” or extending the existing “partial” in parallel to what you are doing (as the current partial is lacking a few features to satisfy everything you have in mind).

For example:

func(~, 1)

would evaluate to partial(func, Placeholder, 1) independently from Pipeline(). While:

obj |> func(~, 1)

would skip the partial(func, Placeholder, 1) construction and take the fast path of what you have done, which would be seen as a compiler-level optimisation as opposed to the mental model of how it works.

1 Like

Yup, exactly. What is in question is the convenience for the above.

result = source |> operator1(constant, ~) |> operator2(~, constant)

Whether now is the right time for it (or ever?) or whether the current one under consideration is the right solution, I am not sure. I am still processing, I guess.


P.S. trailing Placeholders are not allowed. The working code would be:

result = Pipe(source) | p(operator1, constant) | p(operator2, ___, constant) | execute

There are a lot of ways to achieve this, and I had used a similar recipe earlier in this thread. Although Python, being a flexible language, allows users to create different constructs, the point is that a lot of very popular libraries use this particular pattern (I have shared examples of such libraries earlier), so it probably makes sense to have a built-in, standardized, easy-to-use and performant way to cater to that need.

I say libraries, since they reach a wider audience, but there are a lot of people (including me) who would like to have more functional patterns supported in the language. Now, it is perfectly fine if Python doesn’t want to support any more functional patterns than what is already supported. And it is also okay if this can’t be added now, as it may require more thoughtful design, etc. But I believe, given the amount of usage in the wild (even considering the overhead the current workarounds add) and the elegant patterns that this construct can achieve, the need is not unjustified.

EDIT:
Actually, the examples I shared were in a different thread. `functools.pipe` - #28 by sayandipdutta

3 Likes

I think I didn’t get your original intent exactly right. Thank you for the clarification. In this case, I would not even introduce a special magic method at all.

It’s purely an implementation detail. I squeezed it in as a BinOp because it required less invasive modifications to the parser, but you are right - to keep things logical it would be better to have this:

            value=Pipeline(
                left=Name(id='a', ctx=Load()),
                right=Call(
                    func=Name(id='list', ctx=Load()))))])

no magic method needed.

While I agree that the concept should lead the design and not vice versa, my concept is simply different - another way of orchestrating a call, where one argument is being populated by the LHS. There is no such thing that currently exists in Python - I agree.

I think your concept is valid and it’s great that you shared those thoughts! I am not sure myself anymore which way would be better - let’s think this through.

  1. How would you suggest replacing this syntax: "foo bar" |> _.split() ?
  2. Parsing of calls depending on the presence of ~ in the argument list seems to require some advanced PEG; I have not wrapped my head around it yet.
  3. How would the “new partial” accept the placeholder if called from outside a pipeline? As first argument?
  4. The overall design would clearly need to be much heavier with every RHS wrapped as a partial first. Python compiler features simple optimizations like opcode specialization, tier 2 executors and JIT. It does not feature any optimizations resembling the kind that you mentioned (avoiding partial creation when part of pipeline). I am making an educated guess that it would be extremely complex to add this kind of machinery compared to what I proposed so far. In fact, it would be easier to optimize that already at the parser level but then it would clearly complicate the parser extraordinarily (if possible at all).
  5. Therefore, I believe we would have to live with less optimized code, which is not necessarily a blocker since pipelining is not integer arithmetic. It doesn’t strike me as something that needs to be heavily optimized - normally, the heavy compute happening in the components of the pipeline would far outweigh some simple wrapping we do. If one needs to add integers, pipelines are not the best fit :slight_smile:

To summarize, I am not opposed in principle to the alternative concept, but at first glance it comes with some challenges and compromises. The challenges (_.split()) I would like to address. The compromises we would have to live with, IMHO.

Finally, we could find some middle ground, e.g. switch the current pipeline from a BinOp to its own “op” (Pipeline) in order to prevent users from making any BinOp-typical assumptions. We could also switch _ to something else, as currently I am not chasing the occurrences of _ nested in the argument list, e.g. ... |> func((x := _), x) would most likely behave unintuitively in that it would use the local _, not the “placeholder”.

Yup, it is indeed so. I am just trying to point out the alternative and figure out why and which path would be better.

If someone really needs something like this:

mc = operator.methodcaller
"foo bar" |> mc('split')

However, this is not needed in practice as it is the same as:

"foo bar".split()
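For reference, the `methodcaller` route runs today without any new syntax:

```python
import operator

split = operator.methodcaller('split')  # a callable: obj -> obj.split()
words = split("foo bar")
print(words)  # ['foo', 'bar'] - the same result as "foo bar".split()
```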

Yeah, this one would need to be figured out.
One thing is clear: the optimization would apply to the case where “one and only one” placeholder is present in the partialization. Otherwise, if more than one is present, an error would be raised via the standard track.

So there would need to be a syntactic convenience:

foo(~, 1)

would evaluate to

partial(foo, Placeholder, 1)

Also, the current partial would need to be extended for keyword arguments, *args and **kwds, but it is doable and could be invoked via a convenience such as:

foo(a=~)
foo(*~)
foo(**~)

Maybe it is not what you are asking? Some example to show what you mean would be helpful.

Yes, exactly. But if optimization is possible to avoid intermediate construction, then this would deliver the best of both worlds:
a) performance of what you have now
b) modular design, where all parts can be used independently of each other

Well yeah, I haven’t gone down that path, and not sure what would be the optimal way to do it, but the logic is quite simple:

LHS |> RHS
# pseudocode for the dispatch:
if type(RHS) is partial and RHS.args.count(Placeholder) == 1:
    return pipe_partial_rhs(LHS, RHS)
else:
    return pipe_default(LHS, RHS)

(A syntactic shorthand for partial would likely need its own AST node.)
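The same dispatch can be emulated at runtime today with `functools.partial` (here `PH` is a local stand-in for `functools.Placeholder`, which only exists in 3.13+, and `pipe` is a hypothetical helper, not a real API):

```python
from functools import partial

PH = object()  # local stand-in for functools.Placeholder (3.13+)

def pipe(lhs, rhs):
    """Emulates `LHS |> RHS` with the single-placeholder fast path."""
    if isinstance(rhs, partial) and rhs.args.count(PH) == 1:
        # fast path: substitute lhs into the placeholder slot
        args = [lhs if a is PH else a for a in rhs.args]
        return rhs.func(*args, **rhs.keywords)
    return rhs(lhs)  # default: treat RHS as a plain unary callable

print(pipe(3, partial(pow, PH, 2)))   # pow(3, 2) -> 9
print(pipe([3, 1, 2], sorted))        # [1, 2, 3]
```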

That is true. I would leave optimization for later and start with the implementation of the operator. If it is there, then it is already useful and can be used in combination with the current partial.

If this path is taken, the way I see it now, “operator” is the most certain part of this, while partial extensions and syntactic conveniences, optimizations, etc could still use some further thought.

The only question is I think: “With or without magic method?”
I would say with, and have both right and left variants with logic that is a copy-paste of any other standard binary operator. Due to both:
a) Functionality - useful for custom DSLs and new innovative ways to make use of it.
b) Just following the existing logic of other operators would likely be easier than making an exception.

(b) above. Introducing something that aligns with what is currently there and is consistent is much less risky. There are 4 op groups: UnaryOp, BinOp, BoolOp and Compare. Creating a new special case that doesn’t fall into these is a risky move.

But I am not against exploring this path, maybe there are factors unknown that would end up justifying such move.

Yes, I agree. This deserves some more thought. My ~ is just the best one that I have so far. If there is a better syntax - even better.

By the way, this is POC how “new partial” could look like. See doctests.

Attempt This Online!

Passing method names as strings is not great and the intent is less obvious than in _.split().

Due to Python’s pronounced OO side, having this is not a trivial bonus. It allows working with method-based transformations as easily as with function-based ones and seamlessly switching between the two without squeezing mc() everywhere, e.g. pd.read_csv(...) |> _.query("A > B") |> _.filter(items=["A"]) |> _.to_numpy() |> _.flatten() |> _.tolist() |> map(lambda x: x + 2) |> list() |> np.array() |> _.prod()

The concept should drive the design, but realistic constraints must guide both the design and the concept. There are complexities here that need to be addressed concretely; we cannot say that they “just need to be figured out” without sounding a bit nonchalant :wink:

This is not such a big issue for me as it was initially (assuming that a new opcode would be accepted). I’ve introduced a COPY_NULL opcode which allows grabbing values from the stack quite conveniently relative to the last PUSH_NULL. I could manage multiple placeholders but can also limit it to one to avoid unnecessary confusion.

Indeed, I just wanted to know whether, given x = func(1, 2, 3, ~, 4, 5, kw_argument=6), I would call it as x(value_of_the_placeholder, extra_argument, extra_kw_argument="foo") outside of the pipeline. I assume that this would be the case.

As mentioned, real world constraints must inform both the design and the concept itself. This optimization is outside of the scope of the compiler for sure. It would most likely need to happen at AST building or post-processing steps (likely better). I can see this happening as an AST post-processing step (as stripping of partials when in a pipeline).

Agreed, this likely could be achieved in an AST post-processing step.

Agreed; however, without the new partial it is not very different to me from the existing libraries. Hardly justified to implement on its own, IMHO.

This entire idea is about syntactic convenience, I don’t think the operator on its own cuts it - at least not for me.

Magic methods are probably fine.

I think whatever makes sense all things considered should be the way to go. There is still some thinking left to do.

:+1:

:+1:

There is one more thing related to the “Introducing something that aligns with what is currently there” mantra. Currently nothing in the Python syntax produces complex objects using opcodes. It’s important to understand the nuance here between having syntax for creating objects and now introducing new syntax for creating objects using a completely different logic. Where would the new partial be implemented? Would it be a new built-in type? There is nothing in Python currently that would produce instances of packaged classes using opcodes. Wouldn’t a closure with a special flag be a better match?

1 Like

Yeah, you are right. From your example below I can clearly see the benefit of this.

So probably not as convenient, but a mid-ground could be an independent utility such that:

MC = MethodCaller()
mcall = MC.filter(items=["A"])

# where mcall does:
def mcall(obj):
    return obj.filter(items=["A"])

# such that the following works:
... |> MC.filter(items=["A"])

Not as convenient and integrated as your implementation, but I don’t think it is too bad. I am using a similar approach for convenient predicate making and am quite content with it.
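A runnable sketch of such a utility (stdlib-only; the class name is of course arbitrary):

```python
class MethodCaller:
    """MC.name(*args, **kwds) builds the function obj -> obj.name(*args, **kwds)."""
    def __getattr__(self, name):
        def build(*args, **kwds):
            def mcall(obj):
                return getattr(obj, name)(*args, **kwds)
            return mcall
        return build

MC = MethodCaller()
print(MC.upper()("abc"))     # 'ABC'
print(MC.split(",")("a,b"))  # ['a', 'b']
```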

I think what I meant was “needs to be investigated” as I don’t have an answer to this. :slight_smile:

Understandable. It would also be easier to see benefits if the whole thing is presented in 1 go.

I see your point. Indeed, this would avoid one “new thing”, but would introduce another.

Not sure which one is “better”.

I was thinking that new partial could be implemented in its own separate module and functools could alias it.

It might be. I will give a bit more thought to the whole thing.

So regarding possible convenience for partial.

If it had its own syntactic convenience, then it would ideally live somewhere alongside FunctionType. Then functools would alias it and offer programmatic access to it.

Also, in this case it would need a well-thought-out spec which covers all bases. E.g. C++ allows for indexed placeholders, thus it might make sense for Python, being a higher-level language, to offer at least the same level of flexibility.

Thinking something along these lines:

def _passthrough(*args, **kwds):
    return args, kwds

# Sequential mode
foo3 = _passthrough$($, 1, $, *$, e=$, f=6, **$)
result = foo3(0, 2, [3, 4], 5, {'g': 7}, h=8)
print(result)
# ((0, 1, 2, 3, 4), {'e': 5, 'f': 6, 'g': 7, 'h': 8})

# Indexed Mode
# The above is same as:
foo3 = _passthrough$($1, 1, $2, *$3, e=$4, f=6, **$5)

# Allows pointers to the same placeholder
foo2 = _passthrough$($2, $3, 'a', *$1, **$2, c=3, **$3, d=$4)
result = foo2([0, 1], {'a': 1}, {'b': 2}, 4, e=5)
print(result)
# (({'a': 1}, {'b': 2}, 'a', 0, 1), {'a': 1, 'c': 3, 'b': 2, 'd': 4, 'e': 5})

Indexed mode could be left for later.
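Positional indexed mode can be emulated today to validate the semantics (the `Slot` class and `indexed_partial` helper are hypothetical; `*`/`**` placeholder support is left out):

```python
class Slot:
    """Indexed placeholder: Slot(1) plays the role of the hypothetical $1."""
    def __init__(self, index):
        self.index = index

def indexed_partial(func, *spec):
    """Rough emulation of indexed mode for positional arguments only."""
    def bound(*values):
        args = [values[a.index - 1] if isinstance(a, Slot) else a
                for a in spec]
        return func(*args)
    return bound

def passthrough(*args):
    return args

# Repeated and out-of-order placeholders, as in the $-examples above:
foo = indexed_partial(passthrough, Slot(2), 'a', Slot(1), Slot(1))
print(foo(10, 20))  # (20, 'a', 10, 10)
```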

Final result would look something along the lines of:

MC = MethodCallerUtility()

arg |> MC.method() |> foo$($, 1) |> bar$(*$) |> baz$(**$)  # |> zoo$(***$)

I would forget about a package-level “partial”. That’s too much for the most basic syntax. In fact, what I outline below already feels somewhat overwhelming. The package-level stuff could be utilities built on top of what we discuss here, but they would not be an (indispensable) part of it.

How about using ` ?

One could transform (at the AST post-processing stage) `rhs(a1, a2, ~, a3, k1=1, k2=~, k3=3) into a variant of lambda (only at the AST level - for combining with |> later; post-optimization, only regular lambdas would be passed to the compiler for bytecode production):

pipelambda lhs, in_pipeline=False: \
    (
        rhs(a1, a2, lhs, a3, k1=1, k2=lhs, k3=3)
            if not in_pipeline
                else (
                    lhs.__lcall__(rhs, a1, a2, lhs, a3, k1=1, k2=lhs, k3=3)
                        if hasattr(lhs, "__lcall__")
                            else (
                                rhs.__rcall__(lhs, rhs, a1, a2, lhs, a3, k1=1, k2=lhs, k3=3)
                                    if hasattr(rhs, "__rcall__")
                                        else rhs(a1, a2, lhs, a3, k1=1, k2=lhs, k3=3)
                            )
                )
    )

The method invocation could use double `, i.e. ``. It would be simply ``method_name or ``method_name(...). It would translate to:

lambda lhs, in_pipeline=False: \
    (
        getattr(rhs, method_name)(a1, a2, lhs, a3, k1=1, k2=lhs, k3=3)
            if not in_pipeline
                else (
                    lhs.__lcall__(getattr(rhs, method_name), a1, a2, lhs, a3, k1=1, k2=lhs, k3=3)
                        if hasattr(lhs, "__lcall__")
                            else (
                                rhs.__rcall__(lhs, getattr(rhs, method_name), a1, a2, lhs, a3, k1=1, k2=lhs, k3=3)
                                    if hasattr(rhs, "__rcall__")
                                        else getattr(rhs, method_name)(a1, a2, lhs, a3, k1=1, k2=lhs, k3=3)
                            )
                )
    )
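The dispatch logic shared by both translations can be emulated with ordinary functions to check its behavior (everything here - `apply_pipeline`, `SLOT`, the `__lcall__` hook - is a hypothetical sketch of the protocol described above):

```python
SLOT = object()  # stand-in for the piped-value slot

def apply_pipeline(lhs, rhs_func, *arg_spec, **kwds):
    """Prefer lhs.__lcall__, then rhs_func.__rcall__, else a plain call."""
    args = [lhs if a is SLOT else a for a in arg_spec]
    if hasattr(lhs, '__lcall__'):
        return lhs.__lcall__(rhs_func, *args, **kwds)
    if hasattr(rhs_func, '__rcall__'):
        return rhs_func.__rcall__(lhs, *args, **kwds)
    return rhs_func(*args, **kwds)

print(apply_pipeline(3, pow, SLOT, 2))  # plain-call path: pow(3, 2) -> 9

class Traced:
    """A LHS that intercepts the call via __lcall__."""
    def __lcall__(self, func, *args, **kwds):
        return ('traced', func(*args, **kwds))

tag, _value = apply_pipeline(Traced(), repr, SLOT)
print(tag)  # 'traced'
```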

My fancy example could now be written:

`pd.read_csv(...) |> ``query("A > B") |> `filter(items=["A"]) |> ``to_numpy |> ``flatten |> ``tolist |> `map(lambda x: x + 2) |> list |> np.array |> ``prod

|> could be made to accept raw callables (i.e. list instead of `list) or not. I think it’s fine to accept them and just transform raw callables into pipelambda when part of a pipeline. All could happen during AST post-processing.

Heavy but not too bad, and this could satisfy everything we talked about, no? Split into components (the new “partial” and the |> operator) and magic methods (__lcall__, __rcall__) for extensibility, plus terse syntax for syntactic convenience.

Four new tokens (`, ``, ~ and |>), but a nice syntax without unexpected behaviors and not that much cognitive load, IMHO.

What do you think?

PS. Better alternatives for backtick could be ! or \ (if we allowed it mid-line and made it mean line continuation only when actually used at line end).

!pd.read_csv(...) |> !!query("A > B") |> !filter(items=["A"]) |> !!to_numpy |> !!flatten |> !!tolist |> !map(lambda x: x + 2) |> list |> np.array |> !!prod

\pd.read_csv(...) |> \\query("A > B") |> \filter(items=["A"]) |> \\to_numpy |> \\flatten |> \\tolist |> \map(lambda x: x + 2) |> list |> np.array |> \\prod

Actually, I like ! / !! quite a lot - they naturally bring a lot of attention where attention is due.

Yeah, I think I agree with you.
And I think having a syntactic convenience for partial does not make much sense if partial lives in functools - it is a fairly high-level import.
So the only way would be to implement it at a lower level.
And that would be a very long shot…

Yup. Compared to your initial implementation this indeed feels so.
Backticks have faced significant resistance historically (can’t remember the reasons; some might be in Backquotes for deferred expression) - minuscule chance of pulling these through for this fairly niche application.
\ makes it look like LaTeX. Also a long shot. See Python Idea - Latex style macros
! is probably the nicest and most reasonable. But it is also unlikely that the last reasonable one-character operator would be accepted for this specific application.
Ah and also, in the long run, I am placing my bets on PEP 638 – Syntactic Macros | peps.python.org, which intends to make use of it.

To summarize the mentioned ones and some “new” ideas for:

New “partial” / new lambda:
$, !, ~, ?, , :, , ¬, §,

New “method partial” / new method lambda:
$., $$, !!, ~~, ?, !., ?., ~., □., :, ::, :., ††, †., ¬., ¬¬, §., §§

The placeholder:
$, ~, ?, , ..., :

? has roots in placeholders in SQL queries, for example. $ is used in Scala for this purpose and in other places like Bash or JavaScript template strings. We excluded _ I think for obvious and good reasons, but it’s used in Swift for this purpose, I believe.

:pd.read_csv(...) |> ::query("A > B") |> ::filter(items=["A"], :) |> ::to_numpy |> ::flatten |> ::tolist |> :map(lambda x: x + 2, :) |> list |> np.array |> ::prod

could be alright. : looks pretty much like a shorthand for lambda : - at least to me it looks convincing. Perhaps :. would be more natural for the method case?

:pd.read_csv(...) |> :.query("A > B") |> :.filter(items=["A"], :) |> :.to_numpy |> :.flatten |> :.tolist |> :map(lambda x: x + 2, :) |> list |> np.array |> :.prod

Any better?

Do *: or **: look particularly bad - I don’t know. Of course the problem with : is its current use. Could be confusing…

Don’t know if the extended ASCII / Unicode characters stand a chance. This feels modern and alright to me, how about the community?

⧖pd.read_csv(...) |> .query("A > B") |> .filter(items=["A"], □) |> .to_numpy |> .flatten |> .tolist |> ⧖map(lambda x: x + 2, □) |> list |> np.array |> .prod

could be fine, where the hourglass symbolizes delayed execution and it’s dropped for the method part since it’s unambiguous if we allow to start such an expression with the dot. □ as placeholder looks quite natural.

We could also use curly braces instead like this:

pd.read_csv{...} |> .query{"A > B"} |> .filter{items=["A"], □} |> .to_numpy |> .flatten |> .tolist |> map{lambda x: x + 2, □} |> list |> np.array |> .prod

We could also keep the placeholder within basic ASCII:

pd.read_csv{...} |> .query{"A > B"} |> .filter{items=["A"], ?} |> .to_numpy |> .flatten |> .tolist |> map{lambda x: x + 2, ?} |> list |> np.array |> .prod

Not too bad IMHO. The {} would symbolize that it’s not a call but rather something to do with data. {} normally stands for a dictionary or a set. Preceded by a primary would symbolize another collection - of a callable and its arguments. This last syntax is growing heavily on me I must say!

So just to step back a bit. Let’s take 2 base cases for this, which are both available without any changes:

Setup
from functools import partial, Placeholder as _

class MethodCaller:
    def __init__(self, arg=None):
        self.arg = arg

    def __getattr__(self, name):
        assert self.arg is None
        return type(self)(name)

    def __call__(self, *args, **kwds):
        arg = self.arg
        assert arg is not None
        if type(arg) is str:
            return type(self)((arg, args, kwds))
        else:
            assert len(arg) == 3 and len(args) == 1 and not kwds
            return getattr(args[0], arg[0])(*arg[1], **arg[2])

MC = MethodCaller()


class P:
    def __init__(self, *args, **kwds):
        self.args = args
        self.kwds = kwds
    def __call__(self, func):
        return partial(func, *self.args, **self.kwds)
    __rmatmul__ = __ror__ = __rrshift__ = __call__

1. rpipe object

class rpipe:
    def __init__(self, obj):
        self.obj = obj
    def __iter__(self):
        return iter(self.obj)
    def __or__(self, func):
        if func is type(self):
            return self.obj
        return type(self)(func(self.obj))


rpipe(1) | MC.as_integer_ratio() | sum | format@P(_, '.2f') | rpipe
2. Infix operator

# Implementation left out as quite cumbersome
1 |x| MC.as_integer_ratio() |x| sum |x| format@P(_, '.2f') |x| x


So there are 2 working solutions which are NOT utterly inconvenient and don’t require anything too sophisticated or unreasonable imports.

So what could be improvements to above and which of those improvements are sensible when seen in the light of how much complexity they add versus marginal convenience?

1. So say, implementation of the pipe operator |> would turn:

rpipe(1) | MC.as_integer_ratio() | sum | format@P(_, '.2f') | rpipe
1 |x| MC.as_integer_ratio() |x| sum |x| format@P(_, '.2f') |x| x

# INTO NEW:
1 |> MC.as_integer_ratio() |> sum |> format@P(_, '.2f')

As you said, it is not exactly a call, so I would say __pipe__ / __rpipe__ or __apply__ / __rapply__ would be more accurate. (Also, there aren’t any methods with an __l prefix - the standard is the __r prefix for the right variant, and no prefix for the left.)

2. extending current partial would allow for full functionality that you had in mind

_ = functools.Placeholder
__ = functools.ArgsPlaceholder
___ = functools.KwdsPlaceholder
____ = functools.ArgsKwdsPlaceholder

1 |> foo@P(_)
1 |> foo@P(a=_)
[[1], [1]] |> zip@P(__)
{'a': 1, 'b': 2} |> foo@P(___)
([1, 1], {'a': 1, 'b': 2}) |> foo@P(____)


P.S. can we please take the following 2 as a benchmark case for the following ideas so that we can keep perspective on what we are improving?

rpipe(1) | MC.as_integer_ratio() | sum | format@P(_, '.2f') | rpipe
1 |x| MC.as_integer_ratio() |x| sum |x| format@P(_, '.2f') |x| x

I’d like this as a benchmark:

pd.read_csv{...} |> .query{"A > B"} |> .filter{items=["A"], ?} |> .to_numpy |> .flatten |> .tolist |> map{lambda x: x + 2, ?} |> list |> np.array |> .prod
1 Like

I somewhat get what you’re aiming for, but method chaining like this looks very awkward, precisely because method chaining already works well in Python. The natural goal to me would be to design a version of piping that can be interleaved with method chaining.

:pd.read_csv(...) |> :.query("A > B") |> :filter(items=["A"], :) |> :.to_numpy |> :.flatten |> :.tolist |> :map(lambda x: x + 2, :) |> list |> np.array |> :.prod

compare with

(pd.read_csv("my_file")
    .query("A > B")
    .filter(items=["A"])
    .to_numpy().flatten().tolist()
    |> map(lambda x: x + 2, ~)
    |> list
    |> np.array(~)
    .prod()
)

I’m not sure what you intended the .filter term to do, I hope I interpreted it correctly.

There is a lack of clarity there where

|> np.array(~)
.prod()

cannot be converted to

|> np.array
.prod()

Having a benchmark with method chaining in it indeed looks more promising.

I think interleaving as you proposed would be “ill-posed” for the parser. .prod() would evaluate to a “new lambda” and the first part to a numpy array. You would then have this syntax: primary lambda. Should we allow reverse-polish notation in this case?

What you are proposing could be easily achieved by using braces though.