So I’ve been catching up on the thread, and I am now leaning towards a specific syntax or construct like @sadaszewski suggested, as I can see how an operator could be too limiting or confusing.
If I were to strip this feature down to its conceptual core: we are trying to construct an array of partials, which is then wrapped in a function that takes an item and passes it through each partial in turn until it is transformed into the final result.
We can already declare that using this definition of a function called pipe:
from functools import reduce, partial

def pipe(funcs):
    # Thread the initial value through each function in order.
    return lambda initial_value: reduce(lambda acc, func: func(acc), funcs, initial_value)

item_pipeline = pipe([
    partial(map, lambda a: a.name),
    partial(filter, lambda b: "delete red" not in b),
    list
])

processed_items = item_pipeline(items)
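For instance, assuming items is a list of objects with a name attribute (hypothetical sample data, just for illustration):

from types import SimpleNamespace

items = [SimpleNamespace(name="keep blue"), SimpleNamespace(name="delete red thing")]
print(item_pipeline(items))  # ['keep blue']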
This by itself is enough to satisfy my original motivation, and is perhaps a good candidate for functools.
Earlier in the thread, someone suggested that they would rather have the variable which stores the result on the right-hand side.
It’s obviously more limited, but the pipe object could instead double as a context manager:
from functools import reduce

class pipe:
    def __init__(self, funcs):
        self.funcs = funcs
        self.result = None

    def __call__(self, initial_value):
        """Allows direct invocation like a function."""
        return reduce(lambda acc, func: func(acc), self.funcs, initial_value)

    def __enter__(self):
        """Context manager entry: simply return the stored result."""
        return self.result

    def __exit__(self, exc_type, exc_val, exc_tb):
        """Context manager exit: nothing special needed."""
        pass

    def __getitem__(self, initial_value):
        """Allows `with p[6] as result:` syntax by storing the result on self."""
        self.result = self(initial_value)
        return self
p = pipe([
    lambda x: x + 2,
    lambda x: x * 3,
    lambda x: x - 1
])

# As a callable:
print(p(5))  # Output: 20

# As a context manager (square brackets, because __call__ already returns the raw value):
with p[6] as result:
    print(result)  # Output: 23
So okay, perhaps at this point we could say that proposing this simple thing for functools would be enough, and a nice little win for people who need this use case.
But the problem remains that this only works for functions which accept the data argument as their last positional argument.
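To make that constraint concrete, here is a small sketch using the pipe helper above (standard functools.partial behaviour):

# Works: map takes the iterable as its last argument, so partial can
# pre-bind everything else.
shout = pipe([partial(map, str.upper), list])
print(shout(["a", "b"]))  # ['A', 'B']

# Doesn't work as intended: str.replace takes the data (the string itself)
# as its first argument, so partial binds the piped value into the wrong
# slot: this computes "a".replace("b", x) instead of x.replace("a", "b").
broken = pipe([partial(str.replace, "a", "b")])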
The use of placeholder objects would make this cumbersome and hard to read.
And of course you can wrap using lambdas, but I don’t think people would use such a feature:
# `_` stands for a hypothetical placeholder object, not existing functools syntax:
p = pipe([
    partial(str.replace, _, "a", "b"),
    lambda x: x.upper(),
    partial(lambda a, b: f"{a} ends with {b}", _, "!"),
])
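For comparison, a lambda-only version runs today, but every step pays the wrapping tax:

p = pipe([
    lambda x: x.replace("a", "b"),
    lambda x: x.upper(),
    lambda x: f"{x} ends with !",
])
print(p("banana"))  # BBNBNB ends with !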
We could replace this with partial literals or operators with special behaviours, but it doesn’t feel like such a proposal would be accepted, as this grammar would only really be used for this feature and may overcomplicate Python:
p = pipe([
    ***map(process, *_)
])
Therefore I agree we do need special grammar, with the caveat that I have no idea how hard this would be to do.
But what if a pipe was just a special type of lambda?
Like a lambda, the rules for the expression in the body of the statement would be a specialised subset, focused on declaring a pipeline.
This would allow us to reuse existing tokens and introduce at most one new reserved keyword.
What would that look like?
items = pipe:
    map(lambda x: x > 2) |
    list()
This code would be the same as declaring a lambda function which passes an item to map, then the result of map to list, before returning that.
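A sketch of the intended desugaring, assuming the semantics described above:

# Roughly what the proposed syntax would mean in today's Python:
items = lambda data: list(map(lambda x: x > 2, data))

# Or, equivalently, using the pipe helper from earlier:
items = pipe([partial(map, lambda x: x > 2), list])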
Regarding placeholders, we could reuse the "as" keyword, which is already a convention from with statements, combined with the fact that lambda-style statements have named arguments:
items = pipe item:
    fancymap(lambda x: x > 2, item) as result |
    fancyzip(result, 6)
Note these won’t be intermediate variables available outside of the scope; they are just ways of creating a named reference which the Python compiler can use to know where to place the result of the previous step and the item from the start of the pipeline.
Each named reference can only be referred to by the following pipeline step statement (the partial call declared before each |).
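For clarity, a sketch of what that example would mean in today's Python (fancymap and fancyzip are the hypothetical functions from the example, and result is only visible inside the pipeline):

def _pipeline(item):
    result = fancymap(lambda x: x > 2, item)
    return fancyzip(result, 6)

items = _pipeline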