Defining a symbolic syntax for referring to assignment targets

FWIW, aside from the potential performance hit, mutating f_locals should work reliably in 3.13+ due to PEP 667 – Consistent views of namespaces | peps.python.org
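A minimal illustration of that PEP 667 behaviour (assuming CPython; the write-through semantics only apply on 3.13+, and on earlier versions the write is silently discarded for function frames):

```python
import sys

def demo():
    x = 10
    # Under PEP 667 (Python 3.13+), frame.f_locals is a write-through
    # proxy, so this assignment updates the real local x. On older
    # versions, f_locals for a function frame is only a snapshot.
    sys._getframe().f_locals['x'] = 42
    return x
```

On 3.13+ `demo()` returns 42; on earlier versions it returns 10.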

1 Like

I had actually thought about that too, but the problem was that we can’t use a soft keyword for it, since there is no way to syntactically distinguish the soft keyword from a variable name. And if we make it a hard keyword, it would be too much of a breaking change, since the keyword would have to be a commonly used English word that likely collides with variable names in existing codebases.

But now it occurred to me that we can just make the magic placeholder a dunder, so no existing codebase should be using it. As for the name itself, I think __target__ would be good:

some.nested.attribute = __target__.lower()

The code now looks a lot less cryptic/Perlish to me.

Behind the scenes, though, __target__ would be a hard keyword rather than a variable name. It wouldn’t be looked up in the namespace, but rather expanded at compile time.
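As a rough sketch of what that compile-time expansion could amount to (the temporary name `_base` is hypothetical, and a real implementation would presumably evaluate the base expression only once, like augmented assignment does):

```python
from types import SimpleNamespace

# Stand-in object so the example is runnable.
some = SimpleNamespace(nested=SimpleNamespace(attribute='HELLO'))

# Hypothetical expansion of: some.nested.attribute = __target__.lower()
_base = some.nested                         # base expression evaluated once
_base.attribute = _base.attribute.lower()   # __target__ -> _base.attribute

print(some.nested.attribute)  # hello
```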

Similarly, @'' can be made into a new hard keyword such as __target_name__ to look less cryptic.

Yup, same here. I almost wrote my own POV comparing this to the walrus operator, but decided not to halfway through, realising that it isn’t going to be of much use given this is currently at the contemplative stage.

And it took me a fair bit of time to stop abusing the walrus operator, but I am happy to have finally learned to use it responsibly.

The notation, I think, is decent, both visually and semantically. I’ve been wondering what the reason was for @ as the matrix multiplication operator; I’ve never seen @ used anywhere in math before. There was quite a good precedent for this: MATLAB. Python can’t do exactly the same, but retaining * in some form might have been better, e.g. \*, so that @ could be used for something more appropriate, e.g. this. But given the status quo, I don’t think that would fly. So, 2 options:

  1. Change the matrix multiplication operator. I’ve been thinking about how hard of a sell this would be. I know that no one would want to do it, all the backwards compatibility concerns etc., but has anyone analysed the compound cost of a bad decision? And the fact that the earlier a correction is made, the smaller the damage? I am talking in general, not only about this specific case. Surely, if the goal is the best possible version of Python (as opposed to pleasing unsatisfied and reluctant-to-adapt users), big breaks in backwards compatibility are inevitable…
  2. Pick something else for this. One idea would be to go with bash-like argument syntax, but with Pythonic slicing:
a, b, c = {@}
a, b, c = {@[0]}
a, b, c = {@[1:3]}

Or maybe even star, given it being used in expansions:

a, b, c = {*}
a, b, c = {*[0]}
a, b, c = {*[1:3]}

Could also incorporate string-like formatting:

a, b, c = {*!s}
a, b, c = {*[2]!r}
a, b, c = {*[1:3]!u}

Alternatively, use * for strings, @ for values (similar to what bash does).

We have space-separated names as arguments in the standard library too, such as collections.namedtuple and enum.Enum:

ANT, BEE, CAT, DOG = Enum('Animal', 'ANT BEE CAT DOG')
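For reference, that line already runs today, since Enum classes are iterable in definition order:

```python
from enum import Enum

# The functional API splits the space-separated string into member names.
ANT, BEE, CAT, DOG = Enum('Animal', 'ANT BEE CAT DOG')

print(CAT.name)  # CAT
```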

Using my dunder keyword idea above, it can become:

ANT, BEE, CAT, DOG = Enum('Animal', __target_names__)

The matrix multiplication PEP did cover that: PEP 465 – A dedicated infix operator for matrix multiplication | peps.python.org

The __target__ dunder idea is interesting, as it doesn’t provide brevity gains in most cases, but does provide the benefit of avoiding repeated evaluation.

I suspect code that assigns to identifiers would continue to be better off just using the target identifier directly, but attribute access, subscripting, slicing, and tuple assignment could still see some utility.

__target_text__ could replace @'' for both identifier assignment and tuple assignment (__target_name__ wouldn’t match the latter use case).

I doubt you could reasonably use that form to map function parameters to local variables of the same name, though.

2 Likes

Sounds good indeed. With the name __target_text__ we wouldn’t have to worry about a plural form.

I don’t see why not. We can name the new keyword that forwards local variables to function parameters of the same name __same__. The compiler, with access to the AST, can clearly tell which keyword argument an ast.Same node belongs to, and can produce bytecode that loads the local variable of that name for the argument accordingly:

call_with_local_args(some_target=__same__)
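Under this sketch, the call above would compile to the equivalent of today’s explicit repetition (the function body here is purely illustrative):

```python
# Hypothetical stand-in for a function taking keyword arguments.
def call_with_local_args(some_target):
    return some_target.upper()

some_target = "value"

# call_with_local_args(some_target=__same__) would compile to:
result = call_with_local_args(some_target=some_target)
print(result)  # VALUE
```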

This is really a completely separate proposal from the assignment target placeholder proposal, though, and should be discussed in a separate thread (perhaps in the PEP-736 thread, in which I have now made a post).

Yeah the loss of brevity does take away the utility of this idea in cases of a single identifier, particularly as an alternative to PEP-736, where parameter names are always single identifiers.

I agree that we should focus more on its utility for more complex assignment targets.

Going further on the SymPy tangent, I can brainstorm a possible story of how to get there via more straightforward extensions of existing syntax.

The hardest hurdle in this story is the first step. Suppose Python somehow gets dict destructuring assignment:

{'a': a, 'b': b, 'c': c} = {'a': Symbol('a'), 'b': Symbol('b'), 'c': Symbol('c')}

SymPy then defines a helper whose __getitem__ hands out symbols (if it doesn’t already have one):

{'a': a, 'b': b, 'c': c} = Symbols()
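A minimal sketch of such a helper, with a stand-in Symbol class since this isn’t, to my knowledge, an existing SymPy API:

```python
class Symbol:
    """Stand-in for sympy.Symbol, only recording its name."""
    def __init__(self, name):
        self.name = name

class Symbols:
    """Hypothetical helper: subscripting hands out fresh symbols."""
    def __getitem__(self, name):
        return Symbol(name)

s = Symbols()
print(s['a'].name)  # a
```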

By this time PEP 736 has been accepted, and its syntax gets extended to dict destructuring:

{a=, b=, c=} = Symbols()

(Is that Perlish?)
(Is it the direction Python is going?)

People now start writing {Point2D=} = namedtuplemaker('x', 'y')… which isn’t straightforward to read today, but maybe it’s better than with @''?

“Whatever, we have to pick something.”

1 Like

If you’re going to give the target a full identifier, you might as well provide the name rather than hardcoding it:

some.nested.attribute as x = x.lower()

This doesn’t require new tokens, and the syntax mimics what we have in the match statement (which tries to mirror assignment syntax). It definitely looks more Python than Perl (keywords vs. symbols).

Same as your proposal, this doesn’t cover argument passing, only double evaluation (at this point it isn’t much more than a glorified walrus). It is also missing a syntax to capture the name of the target as a str. I’m not really convinced on that, but I think you could do (a, b, c) as (_, names) = sympy.symbols(names).

I’m not super sold on this either, but hopefully putting this idea on the table may help someone find a better proposal.

5 Likes

For me, a big part of this feature was not needing to assign a name for this specific “some.nested.attribute”, so I’m not really sure I like the as proposal.

A benefit could be that, if the name persists, it can be reused in the code that follows the assignment.

Quite like it; one issue, though:

a, b, c as x = ...
# x = [a, b, c] right?
# Then should be able to do:
a, b, c as (x, *_) = ...
# so that x = a

The tuple-naming thing obstructs such an intuitive expansion. That might not be an issue, as one can always slice it on the right-hand side, but it does not seem very elegant nevertheless. Maybe something like:

(a, b, c) as!n ns as vs = dict(zip(ns, vs))

(a, b, c) as!n names = sympy.symbols(names)

Or:

(a, b, c) as ns!n as vs = dict(zip(ns, vs))

(a, b, c) as names!n = sympy.symbols(names)



Side note

This would be the 2nd method that allows extracting an identifier’s name from the identifier, the first one being f"{a=}". However, it would also be the second one that compulsorily does more than that, while a simple, straightforward way to do just this still does not exist, namely something similar to C#’s nameof, which would do exactly that without needing to evaluate anything. Wondering if I could hack this for it:

(a, b, c, names) as!n ns as vs = vs[:3] + (ns,)
print(names)    # ('a', 'b', 'c')
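For comparison, the f"{a=}" trick mentioned above can already be bent into a crude nameof today, though unlike a true nameof it still evaluates the expression:

```python
a = 5

# f"{a=}" renders as "a=5"; everything before the '=' is the source text.
name = f"{a=}".partition('=')[0]
print(name)  # a
```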

Interesting idea, for sure.

What I don’t like about it is that it encourages a coding style that binds multiple different values to the same name. Inevitably, one of those values is going to be badly named.

E.g. you might see

sth.this_must_be_lowercase = GetUserInput()
# (1)
sth.this_must_be_lowercase = @.lower()

At point (1), the value of sth.this_must_be_lowercase violates the requirement: the value is in fact not guaranteed to be lowercase.

I would rather it was written such that different values have different names. Like this:

user_input = GetUserInput()
sth.this_must_be_lowercase = user_input.lower()

This is a fairly marginal coding style distinction, and not something I would usually make a fuss about. But I do think the latter style is marginally better, and it would be unfortunate to add syntax that encourages the former style.

4 Likes

I am -1 to using @ for any of the purposes outlined here, but +1 to long_name as x = foo(x) being sugar for long_name = foo('long_name').

It should work the same for dotted names and tuples, i.e. a.b. c as x = foo(x) would be sugar for a.b. c = foo('a.b.c') and (a,*b, c) as x = foo(x) would be sugar for (a,*b, c) = foo('a, *b, c') - note my intentional inconsistent whitespace to demonstrate that whitespace ought to be normalised when generating the string representation.

Outer parens in tuple assignment would be required to make it clearly unambiguous; a, b as x would be a syntax error but might later (as a separate proposal) be allowed in order to let x be 'b'. I’m -0 on allowing this from the beginning; probably better to see how the general idea works out before adding in this complication.

I would prefer that this feature be limited to passing the string representation of some name to a function (or otherwise using it in an expression), as that’s an actually important use case (namedtuple, TypeVar). My dislike of using this for accessing the target object itself is pretty much what @AndersMunch has already explained: it’s far less confusing to just use a regular ol’ assignment for that.
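For concreteness, those string-passing use cases look like this today, with each name spelled twice:

```python
from collections import namedtuple
from typing import TypeVar

# Proposed sugar: T as x = TypeVar(x)
T = TypeVar("T")

# Proposed sugar: Point2D as x = namedtuple(x, ["x", "y"])
Point2D = namedtuple("Point2D", ["x", "y"])

print(T.__name__, Point2D.__name__)  # T Point2D
```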

“Python is not Perl” indeed!

IMO another good (also English-specific) mnemonic is “at rhymes with that”, and reading @ as “that” is a pretty natural way to speak the statement.

1 Like

I suspect that I’m in the minority here, but I personally found figuring out what assignment target means (i.e. whatever’s on the left of the = operator) to be more confusing than mapping a punctuation character to the words assignment target. So Perlish or not, to me eliminating @ isn’t really fixing anything.

2 Likes

TBH, in this version of the proposal you don’t get much for TypeVar. You are changing:

T = TypeVar("T")

into:

T as t = TypeVar(t)  # is there a better name for "t"?

If you make up a name and copy it over, you’re getting more verbosity than the original and still have repetition. Same for namedtuples. That’s why I didn’t worry too much about the string representation when I covered this; I was more focused on Alyssa’s idea of avoiding re-evaluation of attribute accesses/slicing.

We have __set_name__ so descriptors know their own name. Have we thought of extending __set_name__ to work on arbitrary objects? Or is there perhaps a reason why it’s not done? I think it would be neat if objects could capture the LHS name the way descriptors do:

class Field:
    def __set_name__(self, owner, name):
        self.__name__ = name

field1 = Field()

class Obj:
    field2 = Field()


print(Obj().field2.__name__)

# "field2"

print(field1.__name__)

# Traceback (most recent call last):
#   File "...", line 12, in <module>
#     print(field1.__name__)
#           ^^^^^^^^^^^^^^
# AttributeError: 'Field' object has no attribute '__name__'. Did you mean: '__ne__'?


1 Like

Allowing __target__ on the LHS would make the syntax ambiguous when there is also __target__ on the RHS:

foo = [0, 1, 2, 3]
__target__[2:] = __target__[:2]

Does __target__[:2] on the RHS become foo[:2] or foo[2:][:2]?

Assignment to a name does not modify an object in-place. It merely sets the name on the LHS to a new reference to the object on the RHS. So __target__ = hex(v) would just set the local name v to a new object for that iteration rather than updating the entry in mydict, which will still hold a reference to the original value.
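A quick demonstration of that rebinding behaviour (mydict here is a stand-in for the dictionary from the earlier example):

```python
mydict = {'k': 255}

for k, v in mydict.items():
    # Rebinds the local name v only; the dict entry is untouched.
    v = hex(v)

print(mydict)  # {'k': 255}
```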

1 Like

Alyssa’s elaborate description of how to deal with different assignment targets may seem confusing, but at its heart it’s something quite simple and familiar: It’s how augmented assignment works.

Just like a.b[c].d += 1 only evaluates a.b[c] once, a.b[c].d = @ + 1 would also evaluate a.b[c] just once.
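That single evaluation is observable today. A small sketch (all classes here are made up for the demonstration) counting how often the subscripted base is evaluated during a.b[0].d += 1:

```python
class Cell:
    def __init__(self):
        self.d = 0

class Tracker:
    """Counts how many times it is subscripted."""
    def __init__(self):
        self.getitem_calls = 0
        self._cell = Cell()

    def __getitem__(self, key):
        self.getitem_calls += 1
        return self._cell

class A:
    def __init__(self):
        self.b = Tracker()

a = A()
a.b[0].d += 1  # augmented assignment: a.b[0] is evaluated exactly once

print(a.b.getitem_calls)  # 1
```

Writing it out as `a.b[0].d = a.b[0].d + 1` would bump the counter to 2, which is exactly the double evaluation the placeholder proposal avoids.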

1 Like