Support unchecked iterables as tuple assignment sources

Aha, you have a hidden extra constraint that entries must be both alphabetically ordered and monotonic. You could drop either of those. Indeed, in a lot of cases you must drop one, since you are breaking backwards compatibility by renumbering.

In fact, I don’t understand why you’d want this to be an integer if you’re not serialising it. Why not just use strings? Those would be a lot more readable when debugging or logging.
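
For example (the constant names are made up for illustration), a string value is self-describing in a log line, whereas an integer needs a lookup to interpret:

STATE_IDLE, STATE_RUNNING, STATE_DONE = range(3)
print(f"state={STATE_RUNNING}")     # state=1 -- which state was 1 again?

STATE_IDLE, STATE_RUNNING, STATE_DONE = "idle", "running", "done"
print(f"state={STATE_RUNNING}")     # state=running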


The solution to that problem is not to add a new feature: users would then need to know both when to use the new feature instead of Enum and how to use Enum when it is appropriate or necessary.

Because they make the code a lot longer, and it’s not as easy to check that they are all distinct as it is with integers. But I guess readable names are an objective advantage of Enum, with its members’ name accessor (or is it _name_? anyway).
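
For reference, a minimal sketch of that Enum version (the member names are made up here), using auto() so no values are hard-coded and .name for readable output:

from enum import Enum, auto

class State(Enum):
    IDLE = auto()      # auto() assigns 1, 2, 3, ... so nothing is hard-coded
    RUNNING = auto()
    DONE = auto()

print(State.RUNNING.value)   # 2
print(State.RUNNING.name)    # RUNNING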

I think that for people who already prefer range() to Enum (and, as I said, I have encountered that code in various situations), A, B, C, * = range() will be far easier to learn than Enum. But I hear the point that they should just learn to use Enum, as I think that’s where Alice’s solution is ultimately heading.

Also, to dig my own grave, the new syntax would introduce an asymmetry between *, A, B, C =, A, *, B, C =, and A, B, C, * =: the first two would have to exhaust the iterable, whereas the last would only consume its first three values (which matters when the iterable is an iterator).
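
To make the asymmetry concrete with today’s semantics (the trailing-star form is hypothetical and only appears in the comment):

it = iter(range(10))
a, *middle, z = it       # existing starred target: the iterator is fully consumed
print(a, middle, z)      # 0 [1, 2, 3, 4, 5, 6, 7, 8] 9
print(list(it))          # [] -- nothing left

# Hypothetical: a, b, c, * = it  would stop after three items and leave the rest
# of the iterator unconsumed.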

I like the idea. I’d love the additional possibility of assigning the partially exhausted iterator. Using, for illustrative purposes only, a non-existent (*) syntax, it would look like this:

In [1]: a, b, c, (*)rest_iter = range(10)

In [2]: a, b, c, rest_iter
Out[2]: (0, 1, 2, <unpacking_iterator at 0x105896c10>)

In [3]: list(rest_iter), list(rest_iter)
Out[3]: ([3, 4, 5, 6, 7, 8, 9], [])

which would be equivalent to:

In [1]: from itertools import islice

In [2]: range_10 = iter(range(10))

In [3]: a, b, c = islice(range_10, 3)

In [4]: a, b, c, range_10
Out[4]: (0, 1, 2, <range_iterator at 0x103890f60>)

In [5]: list(range_10), list(range_10)
Out[5]: ([3, 4, 5, 6, 7, 8, 9], [])

The magic in the syntax alone can be a deal breaker here.
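
For what it’s worth, the same effect is available today with a small helper (take is my own name here, not part of any proposal):

from itertools import islice

def take(iterable, n):
    """Return the first n items plus the partially consumed iterator."""
    it = iter(iterable)
    return (*islice(it, n), it)

a, b, c, rest_iter = take(range(10), 3)
print(a, b, c)          # 0 1 2
print(list(rest_iter))  # [3, 4, 5, 6, 7, 8, 9]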


I was thinking this too, but once I arrived at the islice equivalent I realized the syntax doesn’t add much.

I think the islice version might be clearer: you keep using the original iterator, whereas the unpacking-magic version creates an extra, redundant variable for the remainder.

An example where I might use this is the previously-mentioned scenario “grab the first few lines of a file, then read the rest”, but I don’t think

header, some_extra_line, *rows = reader
for row in rows:
    ...

is better than

header, some_extra_line = islice(reader, 2)
for row in reader:  # using the same reader again
    ...
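
Here is a self-contained version of that pattern, with a StringIO standing in for the file (the CSV content is invented for illustration):

import csv
from io import StringIO
from itertools import islice

data = StringIO("name,score\n# comment line\nalice,3\nbob,5\n")
reader = csv.reader(data)

header, some_extra_line = islice(reader, 2)   # consume only the first two rows
for row in reader:                            # keep reading from the same reader
    print(row)                                # ['alice', '3'] then ['bob', '5']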

I am not sure I understand the first line of the OP,

“It would allow the syntax: a,b,c=range(3).”

This is allowed… (I just typed it in to double-check.) So…?
(And yes, to see only the first few terms produced by an infinite generator, which is often desired, I’d also use
[ t for t,_ in zip(mygen(),range(20)) ]. It’s short and easy to write and to read.)
EDIT: now I found “My base goal though was to remove the hardcoded 3…”
OK, then simply: vars().update((n, v) for v, n in enumerate("abc")), or replace "abc" with ['first', 'second', 'third'] for longer variable names.
It is slightly longer than the desired syntax, but it is very readable, unambiguous, and probably more efficient than using more complicated structures.
EDIT 2: to “not just assign integers, but partially unpack an arbitrary iterable, e.g. lines of a file”
=> just replace enumerate(names) by zip(names, iterable), and the (n, v) for v, n in... part isn’t needed any more: vars().update(zip(names, iterable)).

You cannot rely on modifying vars().

Without an argument, vars() acts like locals(). Note, the locals dictionary is only useful for reads since updates to the locals dictionary are ignored.
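
A quick demonstration of both cases (function and variable names are arbitrary):

def f():
    vars().update({"x": 1})   # inside a function, this write is simply ignored
    try:
        return x              # no real local (or global) 'x' was ever created
    except NameError:
        return "no x here"

print(f())            # no x here

# At module level vars() is the module's globals(), so the update does take
# effect there -- but it is still hard to read and easy to break.
vars().update({"y": 2})
print(y)              # 2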

Thinking about this as “stop consuming the iterable” actually makes me like the proposal more. I agree about the concerns with the * syntax, but what about a trailing /:

# handle first two items
first, second, / = iterable
# process rest
for item in iterable:
    ...

One issue I have is that no matter what symbol you use, it will end up just whitespace away from looking like an augmented assignment operator. Not saying it’s an issue for parsing, but it’s not a great look.

first, second, / = iterable
first, second, /= iterable
first, second, * = iterable
first, second, *= iterable

That can be avoided with square brackets if people are bothered by it.

[first, second, *] = iterable
[first, second, /] = iterable

I’d rather be able to slice iterators, personally (implicit islice using slice notation), like iterable[:2].

But I know that has been proposed in the past and wasn’t particularly popular.
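
As a rough sketch of what that could look like with an ordinary wrapper class today (sliceable is purely illustrative, not an existing or proposed API):

from itertools import islice

class sliceable:
    """Wrap an iterable so that slice subscripts delegate to itertools.islice."""
    def __init__(self, iterable):
        self._it = iter(iterable)

    def __iter__(self):
        return self._it

    def __getitem__(self, index):
        if isinstance(index, slice):
            return islice(self._it, index.start, index.stop, index.step)
        raise TypeError("only slices are supported in this sketch")

it = sliceable(range(10))
first, second = it[:2]   # consumes only the first two items
print(first, second)     # 0 1
print(list(it))          # [2, 3, 4, 5, 6, 7, 8, 9]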