Yes, and that’s exactly what makes it so tricky. Instead of an AST parse like I was doing, this would have to be a full static analysis of the entire stdlib at once - kinda on par with doing type checking, but for names.
I had to get used to it a little but I think I really like it. It makes function calls more compact and reduces visual noise – you can absorb information about the function call more quickly. (Sometimes it looks a little weird though when in a function call only a single, short keyword argument is converted to the new syntax, but maybe I could get used to that too.)
Especially if you use black for formatting, function calls can take up a lot of vertical space on my screen, and with this new syntax more of them would fit in the horizontal, single-line configuration. (Should my functions have fewer parameters? Maybe, though even if it’s only 4 parameters, that’s 6 lines with black.)
To the people saying this is just equivalent to positional arguments, I would respond that this syntax still helps prevent mix-ups in the argument order, which might not be directly noticeable when two arguments have the same type.
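To make that concrete, here is a small sketch (a hypothetical `transfer` function, not from the thread): when two parameters share a type, a positional swap is a silent logic bug, while keyword syntax, shorthand or not, keeps the binding visible.

```python
def transfer(amount_cents: int, account_id: int) -> str:
    """Both parameters are ints, so a positional swap goes unnoticed."""
    return f"moved {amount_cents} cents to account {account_id}"

amount_cents, account_id = 500, 42

# Positional call: the two ints are swapped, and nothing complains.
wrong = transfer(account_id, amount_cents)

# Keyword call: the binding is explicit, so argument order no longer matters.
right = transfer(account_id=account_id, amount_cents=amount_cents)
```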
This is very interesting, thank you for doing this. In my totally subjective opinion, it feels like the use of something like an ellipsis or a designated primitive type would make examples here feel much more “natural”.
If we take the argparse.py example:
super().__init__(
    option_strings=_option_strings,
    dest=dest,
    nargs=0,
    default=default,
    type=type,
    choices=choices,
    required=required,
    help=help,
    metavar=metavar)
This kind of feels off and un-Pythonic (can’t articulate why, perhaps just because of unfamiliarity):
super().__init__(
    option_strings=_option_strings,
    dest=,
    nargs=0,
    default=,
    type=,
    choices=,
    required=,
    help=,
    metavar=)
Whereas my gut has no qualms about something like this:
super().__init__(
    option_strings=_option_strings,
    dest=...,
    nargs=0,
    default=...,
    type=...,
    choices=...,
    required=...,
    help=...,
    metavar=...)
Am I alone in thinking like this here?
I’ll note, in case anyone is reading this and is unclear about what kind of API redesign this could mean, that I have frequently over the years refactored things with many arguments to take lightweight objects like dataclasses, named tuples, attrs classes, etc.
So this:
def foo(x, y, z): ...
becomes
def foo(r3_point): ...
And I mostly prefer this kind of API where applicable (it also makes foo easier to extend, e.g. to take 2D points after it was written to expect 3 dimensions).
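A minimal sketch of that refactor (the names Point3 and the body of foo are mine, purely illustrative):

```python
from dataclasses import dataclass

@dataclass
class Point3:
    x: float
    y: float
    z: float

def foo(p: Point3) -> float:
    # the function now takes one lightweight object instead of three arguments,
    # so adding or removing a dimension later doesn't change its signature
    return p.x + p.y + p.z

result = foo(Point3(1.0, 2.0, 3.0))
```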
But it’s not always obvious that this will be useful, clumsily applying it gives you too many types to manage, refactoring is not always cheap (e.g., stdlib), etc etc. So we end up with lots of keyword args sometimes. I’m mostly of the -0 camp that says “yes, and?” I get that it’s an improvement in some cases – I’m just not easily convinced that it’s enough of an improvement enough of the time.
I don’t have a ton to add to this thread beyond what I shared above about not loving how this feature has applied in Ruby. I’ll try to remain optimistic that if added in Python it will work out better though; hope springs eternal!
Probably. Because that already has meaning - it is passing the value Ellipsis as each of those arguments.
Maybe. But even a currently meaningless explicit pass somehow feels “nicer” to me, just because it is explicit:
super().__init__(
    option_strings=_option_strings,
    dest=pass,
    nargs=0,
    default=pass,
    type=pass,
    choices=pass,
    required=pass,
    help=pass,
    metavar=pass)
pass feels wrong. Its current meaning of “explicitly do nothing” is at odds with the “do something” being communicated here.
Here’s a variation on that theme:
super().__init__(
    option_strings=_option_strings,
    nargs=0,
    pass
    dest,
    default,
    type,
    choices,
    required,
    help,
    metavar)
Once again, we have this lovely use of “explicit” to mean “something that I like”. Having a word in there doesn’t change the meaning in any way - there is no ambiguity even without it. Having a word might make it aesthetically more pleasing (though not to me personally), but it’s hardly a matter of explicitness vs implicitness.
Do you have a link to a ruby project using this pattern? I would be interested to see how it is used “in the wild”.
FWIW, looking at the draft PR on scikit-learn (Add punning by joshuabambrick · Pull Request #27680 · scikit-learn/scikit-learn · GitHub), I’m personally not a big fan. It looks like typos, with black formatting it doesn’t really reduce the number of lines, and in most cases we don’t type those variable names since IDEs autofill a lot of it.
I find this proposal a bit more readable:
super().__init__(
    option_strings=_option_strings,
    dest=...,
    nargs=0,
    default=...,
    type=...,
    choices=...,
    required=...,
    help=...,
    metavar=...)
or a variation like:
super().__init__(
    option_strings=_option_strings,
    auto dest,
    nargs=0,
    auto default,
    auto type,
    auto choices,
    auto required,
    auto help,
    auto metavar)
It’s good practice to replace long argument lists with an object and fewer arguments. A long set of keyword arguments usually refers to options that affect the behavior of the function, rather than data, and those options can be contained in a Config object.
Because [TatSu] is a parser generator, it has numerous options that affect the parsing. Long lists of keyword arguments plus **kwargs were all over the place until I recently introduced a ParserConfig dataclass, which allows methods to use or change only what they want without incurring long and complicated argument lists.
I also added this protocol to ParserConfig to make overriding options explicit and easy:
def replace(self, **settings: Any) -> ParserConfig:
    overrides = self._find_common(**settings)
    return dataclasses.replace(self, **overrides)

def merge(self, **settings: Any) -> ParserConfig:
    overrides = self._find_common(**settings)
    overrides = {
        name: value for name, value in overrides.items()
        if getattr(self, name, None) is None
    }
    return self.replace(**overrides)
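The post doesn’t show _find_common, so here is one self-contained way the whole protocol could fit together. The fields and the body of _find_common are my guesses for illustration (not TatSu’s actual implementation); the assumption is that _find_common keeps only the settings naming real config fields.

```python
import dataclasses
from dataclasses import dataclass
from typing import Any, Optional

@dataclass(frozen=True)
class ParserConfig:
    # illustrative fields; the real TatSu config has many more
    start: Optional[str] = None
    whitespace: Optional[str] = None
    ignorecase: Optional[bool] = None

    def _find_common(self, **settings: Any) -> dict:
        # keep only the settings that correspond to actual config fields
        names = {f.name for f in dataclasses.fields(self)}
        return {k: v for k, v in settings.items() if k in names}

    def replace(self, **settings: Any) -> "ParserConfig":
        # override fields unconditionally
        overrides = self._find_common(**settings)
        return dataclasses.replace(self, **overrides)

    def merge(self, **settings: Any) -> "ParserConfig":
        # only fill in fields that are still unset (None)
        overrides = self._find_common(**settings)
        overrides = {
            name: value for name, value in overrides.items()
            if getattr(self, name, None) is None
        }
        return self.replace(**overrides)

base = ParserConfig(start="grammar")
updated = base.replace(start="rule")                  # start overridden
merged = base.merge(start="rule", whitespace=" ")
# merge skips start (already set) but fills whitespace (was None)
```

The design point is that callers can pass a full grab-bag of settings and each method picks out only what applies, which is exactly what the long keyword-argument lists were doing by hand.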
auto is an interesting proposal, though it might be confusing given that enum.auto exists.
@tmk, I’m not 100% clear which thing you’re looking for in an example, but I think you mean “variable lifting” in Ruby? Let me know if I’ve misunderstood.
I don’t have any FOSS projects, unfortunately, but I’ll provide a sanitized example from my closed-source work.
example from a real (closed source) Ruby codebase
This is part of menu building for a static site generator.
I’ve tried to remove any $COMPANY-specific details.
Feel free to DM critiques of my Ruby to me, but not in this already-long thread.
def render_menu(
  item_descriptors, depth: 0, maxdepth: 1, collapsible: false
)
  # render each item in the menu
  rendered_items = item_descriptors.map do |item_desc|
    # do we have subsections to render or not? (only if there is depth left)
    has_subsections = (
      (maxdepth - depth).positive? &&
      item_desc[:subsections] && item_desc[:subsections].length.positive?
    )
    # create the link to the current item
    rendered_item = link_to(
      item_desc[:item][:short_title], relative_path_to(item_desc[:item])
    )
    # combine with the subsections if there are any to render
    if has_subsections
      rendered_item = html_tag(
        'div',
        %W[
          #{rendered_item}
          <button class="caret"
            aria-label="expand/collapse #{item_desc[:item][:short_title]} submenu">
            <span class="caret"></span>
          </button>
        ].join(' '),
        {
          class: 'sidebar-heading',
          role: 'navigation',
          'aria-haspopup': 'true',
          'aria-expanded': 'false'
        }
      )
      rendered_item += render_menu(
        item_desc[:subsections],
        depth: depth + 1,
        maxdepth:,
        collapsible: true
      )
      html_tag('li', rendered_item, class: 'sidebar-submenu')
    else
      html_tag('li', rendered_item, class: 'sidebar-leaf-item')
    end
  end.join

  tree_class = "sidebar-tree sidebar-tree-l#{depth}" + (
    collapsible ? ' sidebar-tree-collapsible' : ''
  )
  html_tag('ul', rendered_items, class: tree_class)
end
Note how that use of maxdepth:, does very little to simplify the code or meaningfully improve readability.
My take is that the feature isn’t really bad, but I’m not seeing a lot of benefit from it and I hate that Rubocop more or less strong-arms me into writing it this way.
I’m trying to tread a fine line and avoid spreading any irrational fear about the feature while still giving voice to my misgivings.
If the goal of the feature is to improve readability, then it needs to decide whether or not
render_menu(
    item_desc.subsections,
    depth=depth + 1,
    maxdepth=,
    collapsible=True
)
is a readability improvement.
I would appreciate a few examples, in any PEP for this, which show when it is not appropriate. I think that would be enough to guide people away from overzealously trying to enforce this usage on everyone.
I was literally about to suggest the same thing! This is what I had in my draft:
Joshua’s example has generated a lot of discussion. I think it’s clear that this feature (like a lot of features) is maybe not appropriate in every situation that it could be used.
One of the things I think the proponents of the feature should try to pin down is in which situations this feature is most appropriate along with some real world examples. This might be good motivation for the feature.
But I’m not sure it’s inappropriate in any case, so I don’t know.
One thing that might help readability would be for abbreviated keyword arguments to be forced to follow ordinary keyword arguments, so you would have to put the abbreviated ones last. This might make them easier to notice.
Yeah, I noticed this as well when sharing the example – it could be “solved” with this arrangement / ordering. But I think that’s a limited solution; and I’ll push on it below.
depth and maxdepth have some natural relationship, and they’ve now been spaced out. You can “solve” that secondary problem by doing more reordering to make them adjacent, but that only works if there’s one pair of parameters which naturally “go together”. With just two such pairs, it doesn’t work:
render_menu(
    item_desc.subsections,
    collapsible=True,
    depth=depth + 1,
    maxdepth=,
    collapsed_section_style=,
)
I would probably still call the version with variable-lifting at the end of the argument list more readable than the one with it interspersed:
render_menu(
    item_desc.subsections,
    collapsible=True,
    collapsed_section_style=,
    depth=depth + 1,
    maxdepth=,
)
but the most readable version, IMO, is
render_menu(
    item_desc.subsections,
    collapsible=True,
    collapsed_section_style=collapsed_section_style,
    depth=depth + 1,
    maxdepth=maxdepth,
)
This is all preference/style. I accept it as a very reasonable position that the second-to-last example here is more readable than the last.
But I think this example argues against requiring the name-lifting keyword args to come after the ordinary ones.
But I’m not sure it’s inappropriate in any case, so I don’t know.
I have difficulty coming up with concise examples where it’s “clearly inappropriate” other than my real-life Ruby one, and even there the inappropriateness comes from how that codebase is managed rather than anything innate to the code.
I would prefer for that codebase not to use the feature because it is used by a pretty large team with various levels of expertise in the language. Avoiding rarely-seen features benefits me because I spend less time explaining extraneous details.
Really, I would disrecommend the feature in any codebase which has made beginner friendliness a stated goal. But I’d also disrecommend the walrus operator, which is a lovely feature, and I might also disrecommend features as innocuous as set literals.
Joshua’s example has generated a lot of discussion. I think it’s clear that this feature (like a lot of features) is maybe not appropriate in every situation that it could be used.
I guess that explains my initial intuitive objection to this proposal. I wonder how the example @joshuabambrick has provided would have looked had it been applied only to particularly long variable names (say, 10+ characters long). So something like this:
super().__init__(
    option_strings=option_strings,
    dest=dest,
    nargs=0,
    default=default,
    type=type,
    choices=choices,
    required=required,
    help=help,
    metavar=metavar)
Would only get a small revision:
super().__init__(
    option_strings=,
    dest=dest,
    nargs=0,
    default=default,
    type=type,
    choices=choices,
    required=required,
    help=help,
    metavar=metavar)
The latter version of the code personally strikes me as the most readable.
I wonder in how many cases this kind of pattern could be resolved using functools.partial, as I usually see it when forwarding the parameters to another function (or functions):
from functools import partial

def foo(*, x): ...
def bar(*, y): ...

# Forwarding parameters
def foobar(x, y):
    foo(x=x)
    bar(y=y)

foobar(x=1, y=2)

# Requiring callables
def foobar2(foo, bar):
    foo()
    bar()

foobar2(
    partial(foo, x=1),
    partial(bar, y=1),
)
The second option is certainly uglier, but also more extensible (e.g. if someone later wanted to change the foo behaviour inside foobar).
If we were to add syntactic sugar, I would prefer to add it for partial and for Callable type annotations (the rejected PEP 677).
I don’t believe any syntactic sugar is necessary. If a positional parameter with the same name as the variable passed does not exist, the variable becomes a keyword argument; otherwise, it is a positional argument.
9 posts were split to a new topic: Assign corresponding parameter value when omitting RHS in init