This is NOT “late binding”. This is “copy” semantics. They are completely different in behaviour and do not serve the same purposes.
Please can you rename your package and start a completely separate thread to discuss it?
I agree it is copy semantics. It’s not late evaluation, but it is a form of late binding.
The package now works with dataclass too.
I don’t understand why the name should be changed (also, it’s too much trouble).
I have what I wanted, which was a way to get rid of the None protocol and the silly typing annotations it requires.
I think there hasn’t been enough interest to start a new discussion.
Not really. It still early binds, but then it copies. The semantics you have devised are copy semantics. That is exactly what you have created.
You cannot use this technique to delay a costly calculation, to reread an external resource, to reflect a change to a global variable, or anything else that late binding would achieve. This is ONLY useful for the very narrow and specific case of using a mutable object and wanting independent copies.
This is copying. Not late binding.
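To make the distinction concrete, here is a minimal sketch (hand-rolled with a made-up _MISSING sentinel, not the package’s API): copying an early-bound default gives each call an independent object, but it never re-reads the global; a default that is genuinely re-evaluated at call time does.

import copy

SETTING = "old"
_MISSING = object()   # made-up sentinel for this sketch

def copies_default(x=_MISSING, _early=[SETTING]):
    # _early is bound once, at definition time, to ["old"].
    if x is _MISSING:
        x = copy.copy(_early)   # a fresh copy, but of the early-bound value
    return x

def late_binds(x=_MISSING):
    if x is _MISSING:
        x = [SETTING]           # the expression is re-evaluated at call time
    return x

SETTING = "new"
print(copies_default())   # ['old'] -- the change to the global is not reflected
print(late_binds())       # ['new'] -- true late binding picks it up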
I agree. Yet it’s what I needed.
Python has provisions for late evaluation with yield, context managers, coroutines, and asyncio, but holding an expression in a declaration for later evaluation can only be done with the help of the parser and the interpreter, not with decorators.
EDIT: MyPy accepts the declarations around Late, and that’s good enough.
And I never said that this was a bad thing to do. Just that it is not at all “late binding”, nor “deferred evaluation”. It is “copying”. This is why I asked you to please call it what it is, not try to pretend that it’s something it isn’t.
There’s a way to have late evaluation with a similar approach to the one I used in Late:
def param_value():
    yield a_complex_expression

@latebinding
def f(x: __(param_value)):
    print(x)
The library would have to detect that the argument is a generator, and it seems that can be done with inspect.isgeneratorfunction() and friends. The library would use next(param.default()) for the value instead of copy.copy().
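A rough sketch of how a decorator in that spirit could dispatch on a generator function (the __ marker and latebinding below are stand-ins written for this post, not the package’s actual code, and the default slot is used instead of the annotation purely for illustration):

import copy
import functools
import inspect

class __:  # stand-in marker, not the real package's class
    def __init__(self, wrapped):
        self.wrapped = wrapped

def latebinding(func):  # stand-in decorator, for illustration only
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, param in sig.parameters.items():
            if name in bound.arguments or not isinstance(param.default, __):
                continue
            wrapped = param.default.wrapped
            if inspect.isgeneratorfunction(wrapped):
                # Late evaluation: run the generator body on every call.
                bound.arguments[name] = next(wrapped())
            else:
                # Copy semantics: independent copy of the early-bound value.
                bound.arguments[name] = copy.copy(wrapped)
        return func(*bound.args, **bound.kwargs)

    return wrapper

def param_value():
    yield ["built", "when", "called"]   # stand-in for a_complex_expression

@latebinding
def f(x=__(param_value)):
    return x

print(f())           # a fresh list, produced at call time
print(f() is f())    # False: the generator body runs again on each call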
@Rosuav, I made some changes, and now this test passes:
def test_generator():
    def fib() -> Iterator[int]:
        x, y = 0, 1
        while True:
            yield x
            x, y = y, x + y

    @latebinding
    def f(x: int = __(fib())) -> int:
        return x

    assert f() == 0
    assert f() == 1
    assert f() == 1
    assert f() == 2
    assert f() == 3
    assert f() == 5
It doesn’t pass MyPy, though.
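For completeness, a resolver in that spirit could distinguish a generator object (advanced once per call, as the fib test above expects) from a generator function and from a plain value; again a hypothetical stand-in, not the package’s code:

import copy
import inspect

def resolve_default(wrapped):
    # Hypothetical helper: turn a wrapped default into a per-call value.
    if inspect.isgenerator(wrapped):
        return next(wrapped)       # shared generator object: 0, 1, 1, 2, 3, 5, ...
    if inspect.isgeneratorfunction(wrapped):
        return next(wrapped())     # generator function: body re-run on every call
    return copy.copy(wrapped)      # ordinary value: independent copy each call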
Why are you telling me about that? Is that anything to do with any of the proposals that I have put forward? Is it anything to do with this thread?
The last example does late evaluation through a generator.
But I agree. This discussion is far from over.
Clearly add(x=defer []) binds x to defer [], and is the wrong thing, as you point out. But if one absolutely had to come up with a grand unified proposal that covers both “deferred expressions” and “late-bound arguments”, then I think what we are looking for here would be add(defer x=[]) instead.
That could make sense, but now when you define the semantics for defer X, you need to support X not just being an expression.
Which would basically just use the word defer in two subtly different, but quite distinct, ways. There’s no real unification here; the semantics for defer x=[] are not particularly related to those of x=defer [], and the use cases would be quite different.
They are orthogonal proposals.
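A rough simulation of the two ideas in today’s Python may show why they are orthogonal: a deferred expression (x=defer []) is a value the caller hands over and somebody later forces, while a late-bound default (defer x=[]) only matters when the argument is omitted. Deferred and _MISSING below are made up for this sketch.

class Deferred:
    # Stand-in for a hypothetical `defer <expr>` object: a thunk.
    def __init__(self, thunk):
        self.thunk = thunk

    def force(self):
        return self.thunk()

_MISSING = object()

def add(x=_MISSING):
    if x is _MISSING:
        x = []                # stand-in for a late-bound default: fresh per call
    if isinstance(x, Deferred):
        x = x.force()         # stand-in for forcing a caller-supplied deferred
    x.append(1)
    return x

print(add(), add())                  # [1] [1] -- the default is rebuilt on each call
print(add(Deferred(lambda: [0])))    # [0, 1] -- a deferred expression forced inside add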
I’d like to offer up the way Julia handles this: multiple dispatch! If you have a function:
def f(x: Union[int, None] = None):
    return do_stuff(x)
Instead, f can be overloaded and written as:
def f():
    return do_stuff(None)

def f(x: int):
    return do_stuff(x)
Which introduces a lot more whitespace and clearly separates the two cases.
I can’t think of a solution with more backwards-compatibility issues than “use an entirely different language”
Not wrong. Luckily, Julia can be made backwards-compatible with Python by using ctrl+f to add end to the end of every code block.
More seriously, there are already implementations of multimethods in Python packages (including PyTorch, where it’s used extensively).
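For reference (and because a second plain def f in real Python would simply shadow the first), the standard library already ships a limited form of this: functools.singledispatch, which dispatches on the type of the first positional argument. A rough sketch, with do_stuff as a placeholder for whatever the real body does:

from functools import singledispatch

def do_stuff(x):   # placeholder body
    return x

@singledispatch
def f(x):
    raise NotImplementedError(f"no overload for {type(x).__name__}")

@f.register(type(None))
def _(x):
    return do_stuff(None)   # the "no value" case, kept separate

@f.register(int)
def _(x):
    return do_stuff(x)      # the int case

print(f(None))   # dispatches to the None overload
print(f(3))      # dispatches to the int overload

Note that singledispatch only looks at the first argument and raises TypeError when called with no arguments at all, so a default value still has to be handled on top of it; it is not by itself an answer to the default-argument question in this thread.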
I’m confused. How is multiple-dispatch a solution to this problem?
The problem with using multiple dispatch to solve default argument values is that the number of declarations may be 2**n for n arguments with defaults: each optional argument can either be supplied or omitted, so every combination needs its own overload.
I wrote the code I can write here:
It lets you break up the function definition into separate functions for each case (one for None and one for the usual case), which lets you simplify the type declaration.
That’s fair. Julia handles the more general case with late binding; my response is only a resolution to this one particular case (or others where there are only one or two optional arguments).