PEP 570: Python Positional-Only Parameters

I agree with this. I just think adding syntax for this is detrimental.

It is much more than just 5 constructors: the standard library is full of this. Lots of interfaces implemented with Argument Clinic use this feature. The problem is that many of them are not documented correctly (this problem is covered in the PEP). There are also interfaces that use this trick even without Argument Clinic, like MutableMapping.update (check also this bug) or dict.update, to name a few. There are also multiple examples in the os module, os.write to name one.

Also, potentially every C function that has a pure Python fallback, inside or outside the stdlib, would need the *args hack. And this is only a subset of the scenarios where positional-only parameters are genuinely needed, such as any interface that needs to accept arbitrary keyword arguments (like Formatter.format).
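For illustration, here is a minimal sketch of the *args hack (not the actual stdlib code): the bound object and the format string are unpacked by hand so that callers can pass any keyword argument, even ones literally named self or format_string:

    class Formatter:
        def format(*args, **kwargs):
            # self and format_string stay out of the keyword namespace on purpose
            self, format_string = args
            return format_string.format(**kwargs)   # stand-in for the real vformat logic

    f = Formatter()
    print(f.format("{self}/{format_string}", self="a", format_string="b"))   # prints a/b

With a plain `def format(self, format_string, **kwargs)` signature, the call above would instead fail with "got multiple values for argument 'self'".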

In the PEP and in this discussion, there are more examples of usage outside the standard library.

Check also @storchaka's reference PR for more examples where the standard library could benefit from this and also become compliant with PEP 399.

1 Like

But do you know why that is the case? Simply because, when Argument Clinic arrived, it was decided to migrate APIs without changing behaviour. And since those APIs did not accept named arguments (largely for accidental reasons), after migrating to AC they still don’t.

Case in point: os.write initially didn’t accept named arguments and, after the conversion to Argument Clinic, it still doesn’t. The original code was:

    if (!PyArg_ParseTuple(args, "iy*:write", &fd, &pbuf))
        return NULL;

and it was simply replaced with the AC equivalent.
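The observable behaviour is the same before and after the conversion; something along these lines (the keyword names in the commented-out call are just illustrative, since the parameters cannot be passed by keyword at all):

    import os

    fd = os.open("demo.txt", os.O_WRONLY | os.O_CREAT)
    os.write(fd, b"hello")            # works: arguments passed positionally
    # os.write(fd=fd, data=b"hello")  # TypeError: keywords are rejected (exact message varies by version)
    os.close(fd)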

This is a larger and different debate, and it does not invalidate any existing concern or any existing justification for this. The only thing I wanted to point out is that there are more interfaces that use positional-only parameters; I do not want to imply that every single one of them falls under the reasons exposed in this PEP.

One of the problems in identifying this use case is that C functions can implement this behaviour very easily without Argument Clinic (in the stdlib and outside it), so unless it is declared in the documentation or in the docstring, it is difficult to know whether a function uses positional-only parameters and, if it does, whether that is deliberate or just accidental.

1 Like

Another interesting point of positional-only parameters vs the *args hack is that having the argument names in the signature can be helpful for tab completion and for static analysis engines used for testing, coverage, typing, auto-completion, IDEs, etc. Using arbitrary argument lists with *args hides all this information from both the human and the computer.
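For example, compare what inspect reports for the two approaches (the / form is the 3.8+ syntax this PEP proposes; the function names are just illustrative):

    import inspect

    def divmod_py(a, b, /):     # proposed syntax: parameter names stay visible to tools
        return (a // b, a % b)

    def divmod_hack(*args):     # *args hack: the real parameters are hidden
        a, b = args
        return (a // b, a % b)

    print(inspect.signature(divmod_py))    # (a, b, /)
    print(inspect.signature(divmod_hack))  # (*args)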

2 Likes

Well, if the reasons are accidental, then the need to make it easier to define such APIs in pure Python is unfounded :wink:

You don’t really have to tell me…

1 Like

Even if all the interfaces were accidental (and they are not), the reasons this PEP lists for justifying positional-only arguments would still be valid :slight_smile:.

In my humble opinion, we should focus on the different scenarios that justify positional-only parameters, not on counting how many functions or interfaces already use them (and I think there are more than enough justified cases already listed in the PEP or highlighted in previous comments and examples).

This is fine, but then you need to rewrite the rationale of the PEP to justify the idea in its own right and not because they already appear in Argument Clinic documentation. (When it comes to justifying the choice of / it’s fine to cite prior art.)

1 Like

I still think the fact that they already appear in Argument Clinic code stands and still applies, as there are more than enough important examples listed. What I am saying is that the core argument should not be a matter of counting how many of them there are (and, as I have said, there are enough IMHO).

This is especially important (and I am starting to repeat myself) for compliance with PEP 399. So I don't think the rationale needs to be changed in that regard.

Perhaps it should be changed to lead with that rationale, then? (Though I think it’s still debatable - why add restrictions to the pure Python implementation when you could just implement the equivalent functionality in the accelerator?)

Thanks for the suggestion, but I don't think this reason is so much more important than the rest of the ones explained in the PEP that it justifies leading with it. The document lists several reasons in the rationale that are all complementary; some of them involve Argument Clinic and others do not.

Because there are cases where positional-only arguments are beneficial. I think several comments already explain this, and there are also Serhiy's PR and the PEP document.

Implementing this behaviour in C is very easy, so even without Argument Clinic you would need positional-only parameters if you want pure Python fallbacks and compliance with PEP 399.
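A minimal sketch of what that looks like (the module and function names here are hypothetical): the pure Python fallback uses the proposed syntax so that it rejects keyword calls exactly like the C accelerator would:

    def checksum(data, seed=0, /):              # proposed 3.8+ syntax
        return (sum(data) + seed) % 255

    try:
        from _checksum_accel import checksum    # hypothetical C implementation
    except ImportError:
        pass                                    # keep the pure Python version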

I guess a decorator-based alternative would look something like:

    @positional_only_args
    def len(obj):
        ...

    @positional_only_args(count=1)
    def dict(mapping=None, **kwargs):
        ...

This does seem like a plausible alternative to me. It’s a bit less terse, but OTOH it’s definitely easier to understand the first time you see it, and easier to document and search for. The PEP has some counter-arguments:

I’m not sure what “asymmetry on how parameter behaviour is declared” means.

It’s true that it would require updating argument clinic, the inspect module, etc. But any change here is going to involve substantial changes across the ecosystem – I don’t think this is a serious distinction between the decorator approach and the / approach.

The biggest change with the decorator approach is that we’d want to teach help() and similar tools (sphinx) to show the full function signature, including the decorator. This would be a change from historical practice, because historically signatures were always 1 line, and now they could be multiple lines. I’m guessing this was a major factor in Argument Clinic going with /, since Argument Clinic was trying to be as minimal and unobtrusive as possible? But now that we’re talking about promoting this to a first-class language feature, we should relax that concern some. And tbh single line signatures are getting seriously crowded, between defaults, *, type hints, and now /. Splitting complex signatures into multiple lines seems like a win for readability, and python has traditionally valued readability over terseness.

There’s no reason that calling the decorated functions has to be any slower. It would be if we add the constraint that we’re not allowed to change the interpreter, but that constraint would rule out the / notation entirely, so it’s hardly a fair comparison :-). The implementation could be in C in both cases, with the decorator just setting some metadata on the function object to be consumed by the underlying function call machinery.
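For what it's worth, a rough pure-Python sketch of that metadata-setting idea (the decorator name and count parameter are taken from the example above, and the __positional_only__ attribute is made up; this only records the information and enforces nothing by itself, that job would fall to the call machinery):

    def positional_only_args(arg=None, *, count=None):
        # Record how many leading parameters are positional-only; a real
        # implementation would have the interpreter's call machinery (in C)
        # consume this metadata when the function is called.
        def decorate(func, n=count):
            func.__positional_only__ = (
                n if n is not None else func.__code__.co_argcount
            )
            return func

        if callable(arg):        # bare use: @positional_only_args
            return decorate(arg)
        return decorate          # parametrised use: @positional_only_args(count=1)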

So the PEP’s arguments against the decorator approach seem very weak to me. Given that this alternative is coming up multiple times in the thread, it would be nice if the PEP could at least expand this section to address it more seriously.

IMO, if this is something that we expect people to use frequently, say in >10% of all function signatures, then the / syntax is better, and if it’s more of a niche feature that’s primarily used to handle the special needs of functions like float and dict, then the decorator approach is better.

2 Likes

The asymmetry is that you would declare keyword-only arguments using syntax in the language (the * marker) and positional-only arguments using a decorator. This is a major inconsistency in my opinion, and it is one of the main reasons we dismissed that alternative.
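Under the PEP, both restrictions read the same way inside a single signature (the function and parameter names here are just illustrative):

    def combine(pos_only, /, either, *, kw_only):
        return (pos_only, either, kw_only)

    combine(1, 2, kw_only=3)           # ok
    combine(1, either=2, kw_only=3)    # ok
    # combine(pos_only=1, either=2, kw_only=3)   # TypeError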

Also, using a marker in the signature is an immediate way of expressing this in the documentation. If we use the decorator, you still need some kind of marker in the documented signature to show which arguments are positional-only, which in my opinion highlights this asymmetry even more.

The decorator approach is certainly possible, but we think it is worse for these reasons, so we prefer the "/" for this PEP. How important this asymmetry is will vary from person to person, but we believe it is an important argument, and it led us to leave the decorator as a rejected idea.

1 Like

Huh! It would never have even occurred to me to think of that as an inconsistency. To me the two features have totally different use cases and target audiences. The names make them sound more similar than they are.

Also, Python historically has prioritized practical utility and simplicity over conceptual consistency. For example, in PEP 465 I originally proposed @ and @@ operators for matrix multiplication and matrix power, for consistency with * and **. But Guido made me drop @@, because it’s rarely useful and he said consistency alone isn’t a good enough argument for adding new syntax.

If we have a standard notation for this, then we'll be able to use it in docs. If we use /, we can put / in the docs; if we use a decorator, we can put the decorator in the docs. It's not a distinguishing feature IMO. (And I found it striking to read Raymond's thread, where a bunch of well-known Python experts said they had no idea what the / notation meant… we've had it in the docs for a while and it hasn't been very successful so far.)

1 Like

Very interesting point. But I think this may be categorically different, as we are talking about how the function signature shows how the function should be called. There is an immediate link between the function definition and the properties of its parameters, as opposed to some construct applied over the function (the decorator). Every property of the parameters is in the signature itself, and the marker acts as a separator between which ones are which. I think this case for consistency is more powerful because the "*" marker is already doing the same thing, so we are not "mirroring" something as in the "@" vs "@@" case; we are following the same construction pattern.

1 Like

It can be understood somewhat if you think along the lines of "why did they choose / for this meaning and not something else", but it is not very obvious if you look at that symbol for the first time and try to figure out its meaning on your own. Python users are already used to symbols having different meanings in different contexts (e.g. : at the end of an if vs : as a separator in a dict), and it is not immediately obvious IMO that / has the opposite meaning to * just because they are opposites in the multiplication/division context.

The "/" has only been properly documented in the past 2 months, it was unique to CPython, and users could not try it in the interpreter. All of this explains why users are surprised when they encounter it. This is covered in the PEP document as well.

That would be very surprising, as the documentation for the API would include a decorator. If you opt to use the "/" in the documentation instead, then we are back at square one and we have an asymmetry between how the feature is defined (the decorator) and how it is displayed ("/").

Also, if you use a decorator, tools would need to do extra parsing to detect that the signature of the function is changing, or they would need to execute the module to access the metadata injected into the function. This may be even more complicated for closures. Users can rename the decorator, or worse, so it may be very cumbersome to obtain that information.

Also, the decorator could be backwards incompatible, as we would be introducing a built-in whose name may already be in use in existing code.

I’m just saying, consider two ways of writing these docs:

    @positional_only_args
    class float(x=0.0)
        Return a floating point number constructed from a number or string x.

    class float(x=0.0, /)
        Return a floating point number constructed from a number or string x.

Some people will dislike both, and some people will be happy with both, but it’s hard for me to imagine anyone saying that they think the second one is clear and obvious, but the first one is opaque and confusing :-).

TBH it kinda feels like you’re scrambling for arguments to support a pre-existing conclusion here. Yeah, static analysis tools would have to do a bit more work to examine ast.FunctionDef.decorator_list, and some might decide to use some heuristic to recognize the decorator instead of doing a complete data-flow analysis. That’s all pretty normal for static analysis tools though. Python docs have been neglecting to mention positional-only args for decades, and most people never noticed. I don’t think anything terrible will happen if a small fraction of tools fail to document a small fraction of positional args because someone uses a decorator in some perversely dynamic way.
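For instance, a heuristic of that kind is only a few lines with the ast module (this checks decorator names textually rather than doing data-flow analysis, and the decorator name is the hypothetical one from earlier in the thread):

    import ast, textwrap

    source = textwrap.dedent("""
        @positional_only_args
        def f(a, b):
            return a + b
    """)

    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            names = {d.id for d in node.decorator_list if isinstance(d, ast.Name)}
            print(node.name, "positional_only_args" in names)   # f True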

I don’t know that it should be in __builtins__ – it might make more sense to put it in functools or something. Either way though, adding new names is just about the safest thing we ever do, backwards-compatibility-wise.

2 Likes

I still find it surprising that you need to include a decorator in the function signature to understand how to call the function. This is the asymmetry I was talking about before.

Doing the same job with the marker is much easier for them, and it will always correctly report the number of positional-only arguments instead of relying on some heuristic. This can be especially important for mypy, also taking into account some cases Guido mentioned regarding Liskov violations for subclasses.

I understand your position. I am not saying that it is impossible to use a decorator, and I am sure there are ways around the challenges. What I am saying is that we have spent a lot of time playing with different versions, and in our opinion the reasons mentioned make us prefer the marker. I understand that people may disagree and think the decorator is better, but for us the drawbacks of that approach are worse than the drawbacks of the marker. The same can be said for the advantages. As with anything involving different kinds of trade-offs, it is understandable that different views can exist around this, and I respect them. But our conclusion is to propose the PEP with the marker.