It’s not about instantiation cost. Argument-list evaluation happens on each observation, so the cost grows in proportion to the number of “observations”.
In other words, each time you evaluate it, you pay this small penalty.
In this case, please provide cases where this needs to happen so many times that it matters.
The way I see it, most of the time it can be cached, and this should ideally be an option of the object.
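For what it’s worth, that caching option could look something like this minimal Python sketch (CachedDefer and its observe method are hypothetical names for illustration, not part of the prototype):

```python
# Hypothetical sketch of a deferred value that caches its first evaluation.
class CachedDefer:
    _UNSET = object()  # sentinel: "not evaluated yet"

    def __init__(self, thunk):
        self._thunk = thunk
        self._value = self._UNSET

    def observe(self):
        # Pay the evaluation cost once; later observations reuse the result.
        if self._value is self._UNSET:
            self._value = self._thunk()
        return self._value

calls = []
d = CachedDefer(lambda: calls.append(1) or len(calls))
d.observe()
d.observe()
print(len(calls))  # 1: the thunk ran exactly once
```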
I have a specific problem in mind and specific use cases, but I am a bit unsure what you are trying to solve here. Thus, the decisions you make seem to be based more on “I have found a cool way to optimize” than on “I know why this optimization is essential”.
To give you an example: I have spent a bit of time optimizing partial.__call__, but I know why:
map(partial(func, *args, **kwds), long_iterable)
partial will potentially be called very many times and this allows new iterator recipes that are competitive, etc.
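To illustrate the hot path being described (the names here are illustrative): partial’s __call__ runs once per element, so any per-call overhead is multiplied by the length of the iterable.

```python
from functools import partial

# partial.__call__ executes once per element of the iterable,
# so its per-call cost scales with the iterable's length.
step = partial(pow, 2)              # step(n) == pow(2, n)
result = list(map(step, range(5)))
print(result)  # [1, 2, 4, 8, 16]
```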
Why do the optimizations you are so bent on matter so much that you would sacrifice design and simplicity for them?
Why should it be False, though?
>>> x = [1,2]
>>> y = x
>>> x is y
True
Why did you change my example? I don’t understand why it would make a difference.
So assignment doesn’t evaluate. OK. So how do I get tmp to be the value of z, rather than one of these DeferExpr objects? In other words, if I do want to assign the current value of z to tmp, how do I do that?
I repeat, you’re not answering my question by pointing me to the documentation that explains the intended semantics. You’re showing me what the prototype does (which I could have found out myself) and claiming (arbitrarily, as far as I can tell) that some of the behaviours are wrong, without giving any justification.
You do realise that you’re completely wasting your time[1] unless you provide the documentation for the semantics, don’t you?
Well, it’s a fun little experiment, but it’ll never be included in the language ↩︎
I’m tired of seeing people say this to you, and frustrated that I need to make the same point, but where is the definition of what an “observation” is???
But if you do that, how do I find out if something is a DeferExpr? Don’t tell me I never need to know; you can’t be sure of that.
I printed other stuff in the test code. A string helps to signify which line is doing the print.
A helper function DeferExpr.eval() can ensure its return value is not a DeferExpr (forgive me if its name looks confusing; that can be changed).
But now a 1 doesn’t behave like an int, because it has an .eval method. And if I do
class C:
    def eval(self):
        print("This is C")

c = C()
x = `c`
what does x.eval() do? If it prints “This is C”, we don’t have a way to evaluate x. If it evaluates x, then x behaves differently than c does.
None of this is new. These issues have been discussed before, and you’re wasting everyone’s time if you don’t have new answers.
An observation is any operation other than direct assignment or argument passing. I am working on making my prototype actually work this way before I write down a solid promise.
Check out this macro “OBSERVE” for its most precise behavior. It works recursively (i.e. if x => DeferExpr, it will observe the return value until a non-DeferExpr is returned).
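A rough pure-Python model of that recursive unwrapping (the DeferExpr class here is a stand-in for illustration, not the actual C implementation):

```python
# Stand-in for the real DeferExpr: it just wraps a zero-argument thunk.
class DeferExpr:
    def __init__(self, thunk):
        self.thunk = thunk

def observe(value):
    # Keep evaluating while the result is itself a DeferExpr,
    # so chained deferrals collapse to a plain value.
    while isinstance(value, DeferExpr):
        value = value.thunk()
    return value

inner = DeferExpr(lambda: 42)
outer = DeferExpr(lambda: inner)  # a deferral that yields another deferral
print(observe(outer))  # 42
```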
I have a C API in my working draft. I held it back because I am struggling to fix a segfault. Here is what it looks like:
No it does not:
# Tested on local binary
x => 1
x.eval # AttributeError: 'int' object has no attribute 'eval'
It’s only available through DeferExpr.eval() - NOT DeferExpr(...).eval().
The attribute space of a deferred variable is completely preserved - no magic backdoors whatsoever. That’s why I am so excited and invited everyone to try it out.
P.S. It even works with auto-completion in the REPL - zero lines of code were written specifically to make this happen!
Python 3.14.0a1+ (remotes/origin/feat/defer-expr:6c687e41e0, Nov 10 2024, 05:00:20) [Clang 16.0.0 (clang-1600.0.26.4)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import time
>>> t => time
>>> t. <TAB KEY>
t.CLOCK_MONOTONIC t.CLOCK_UPTIME_RAW_APPROX t.clock_settime_ns( t.monotonic() t.strftime( t.timezone
t.CLOCK_MONOTONIC_RAW t.altzone t.ctime( t.monotonic_ns() t.strptime( t.tzname
t.CLOCK_MONOTONIC_RAW_APPROX t.asctime( t.daylight t.perf_counter() t.struct_time( t.tzset()
t.CLOCK_PROCESS_CPUTIME_ID t.clock_getres( t.get_clock_info( t.perf_counter_ns() t.thread_time()
t.CLOCK_REALTIME t.clock_gettime( t.gmtime( t.process_time() t.thread_time_ns()
t.CLOCK_THREAD_CPUTIME_ID t.clock_gettime_ns( t.localtime( t.process_time_ns() t.time()
t.CLOCK_UPTIME_RAW t.clock_settime( t.mktime( t.sleep( t.time_ns()
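For readers curious how this can work, here is a hypothetical pure-Python sketch of the attribute forwarding involved (the real prototype does this at the C level); delegating __dir__ is what drives the REPL’s tab-completion:

```python
import time

class AttrProxy:
    """Hypothetical proxy that forwards attribute access to a wrapped value."""

    def __init__(self, thunk):
        object.__setattr__(self, "_thunk", thunk)

    def __getattr__(self, name):
        # Called only when normal lookup fails, so public attributes
        # resolve against the evaluated value.
        return getattr(self._thunk(), name)

    def __dir__(self):
        # dir() delegation is what REPL tab-completion consults.
        return dir(self._thunk())

t = AttrProxy(lambda: time)
print("sleep" in dir(t))  # True
```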
A good idea is to have utility functions, so that the object itself completely mimics the wrapped value and back-door access is provided via specialised utility functions. E.g.:
PyObject *
compute(DeferExprObject *obj)  /* hypothetical struct holding func/args/kwds */
{
    return PyObject_Call(obj->func, obj->args, obj->kwds);
}
A lot is still to be solved, but I have to admit, some progress is being made here.
At least in this respect…
Something like this?
def observe(obj: DeferExpr[T] | T) -> T: ...
It turned out to be a bad idea that I borrowed the namespace of DeferExpr. Every Python user will assume DeferExpr.eval is available through DeferExpr().eval - while it is not.
This has been discussed above. Other objections to backquotes include the difficulty of using them in Markdown code blocks - and I do agree on that. The good news is that we now have 3 different options for the syntax. Please see above for details.
In the meantime, the focus of discussion has mostly shifted to the behavior side - what’s unique about the DeferExpr, and what benefits can we claim for it?
No. If you think so, you probably missed the point of this entire proposal:
I see no relevance between weakref.proxy and this proposal. They solve different problems.
In fact, if PEP 750 is passed, we will probably lose the d-string as one of the defer-expr syntax candidates.
A DeferExpr is supposed to be parsed immediately by the AST and compiled to bytecode. Doing eval(d_str) upon observation completely breaks function closures and scope. The performance would also be unacceptably poor.
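A small illustration (not from the prototype) of the closure problem: a string of source text does not capture the enclosing function’s scope, so evaluating it later fails.

```python
# Source text is just text: the local variable z is not captured,
# so eval() at another site cannot see it.
def make_deferred_source():
    z = 41
    return "z + 1"

src = make_deferred_source()
try:
    eval(src)
    lost_closure = False
except NameError:  # 'z' is not defined at the eval site
    lost_closure = True
print(lost_closure)  # True
```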
P.S. I can no longer edit the title nor content of my original post. But if given a chance from the moderators, I would like to do the following:
Edit the title so it does not refer to a specific syntax (back quotes). We now have at least 3 options available to choose from.
Remove a bad example, def f(a, b, c => a + b): ... - it is not supposed to be solved by this proposal.
At the start of the post, add a link to the github fork where I implemented a working prototype - so people can either try it out or collaborate on it.
In the post, list alternative syntaxes for consideration:
In the original post, clarify the definition of an “observation”:
Also, add a note that in the current prototype, some statements will break this promise (e.g. is and type).
These edits will help new readers if they do not want (or do not have time) to read through the entire thread. We have 94 posts already and I anticipate more to come.
Because the LHS and RHS of is are supposed to be two independently constructed list objects:
# Equivalent to
>>> x = lambda: [1, 2]
>>> y = x
>>> x() is y()
# Should be False because they are separate
# observations of a list construct.
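The same semantics, modeled with zero-argument lambdas (an analogy, not the prototype itself), runnable as-is:

```python
x = lambda: [1, 2]
y = x
print(x is y)      # True: the lambda object itself is shared
print(x() is y())  # False: each call ("observation") builds a fresh list
```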
You can take a step back, reconsider the entire proposal, and try again while addressing the questions raised here. I am not clear on what your proposal is about, even though I am familiar with deferred expressions. It’s difficult to follow, and not all cases are discussed, especially edge cases.
That’s exactly why I decided to devote a lot of effort and make a working prototype.
Instead of speculating on “edge cases” or “undefined behaviors”, you can now just name a specific case and challenge this proposal with actual results. I can even run it for you if you would rather not compile it yourself.
Content below is not part of the conversation. I should have posted it separately.
P.S.
I am not good at expressing stuff - but I feel terribly bad when someone says that I am “brainstorming” (a.k.a. wasting everyone’s time). To me this is a very, very serious accusation.
I think a piece of working code can explain itself much better than I could ever do.
I should have said ‘review’ instead of ‘reconsider.’ I didn’t mean to discourage. I noticed there has been some brainstorming, so to be fair, starting a new thread might be the best approach. Reading through 100 posts in your free time is not easy—just my suggestion.
The problem is that “working code” can’t distinguish between intended behaviour and “accidental” behaviour. I completely sympathise with the problem of not being able to explain the intended behaviour clearly in words, but it is going to be necessary if you plan on writing a PEP for this, and the sooner you make an initial attempt at writing a spec, the sooner people will be able to start helping you refine the wording.
Thanks for bearing with me! I will try my best to come with a more readable version of this specification.
He’s not the only one who can’t see the point, then. You’ve already dismissed your own late-bound-function-arguments use case, and your non-backdoor approach for the typing-forward-references use case (assuming it can even be done without accidentally forcing an evaluation at some point) doesn’t really improve the syntax (backticks instead of quotes).
So all that’s left is your new approach of declaring zero-argument lambdas, which by definition can already be done with lambdas. And, by requiring an explicit () to evaluate, lambdas automatically solve all the problems you’d be introducing regarding when evaluation takes place.
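To make that point concrete, a small sketch (variable names are illustrative): the lambda defers evaluation, and the explicit () marks exactly when it happens.

```python
z = 10
tmp = lambda: z + 1  # nothing is computed yet: evaluation is deferred
z = 41
print(tmp())  # 42: the explicit () marks exactly when evaluation happens
```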