I like type hints. But if you’re writing a method that returns a choice of None, a string, or two different generics, does the cost/benefit of using a static type checker in a dynamic language really still work out?
I’m not sure I fully understand the proposal – it looks like this is a way of having a type var pick up the type of a default argument? What happens in a context like a generic class where the type var is already bound; is that just an error?
To me, this looks like a special kind of overload. I wonder if it could be satisfied by improving the ergonomics of overload? I find overload very clunky to use when two or more arguments control the output type. Maybe this proposal is the best way to spell this, but it sure would be nice to be able to write
I think this is the wrong question to be asking. Assume it’s a 30,000 LOC project which successfully lints with mypy --strict. Now you want to add a helper with these semantics. What do you do?
In these sorts of cases, I have sometimes written more verbose runtime code with multiple different helpers, rather than a stack of 6+ overloads, to express what I would have done “in the old days” with a single function.
Wanting to be able to write the natural function for these use cases with correct type hints makes perfect sense.
In structure/usage, the return types pretty much collapse by input choice; it’s just that it’s a pain to write out a matrix of overloads when multiple inputs affect the output.
Fair enough. I know your point is that it adds extra overloads, but I prefer the first example, as it’s the most readable, even though you are arguing against it.
I’m not objecting to your idea, by the way; it doesn’t seem like it would break anything or be backwards incompatible (but I don’t have to maintain the language). I just don’t think I would ever use it myself.
Fair enough. It makes an already type-checked code base like that more robust and adds value there.
Now you want to add a helper with these semantics. What do you do?
I wouldn’t try to solve it with type checking. Use a railroad.
Off the top of my head, if it’s a more complicated return than Maybe (which None, or even exceptions, can handle perfectly well), I’d define a new class for the helper to return: either with predicate methods, e.g. is_converted, is_default, or maybe an Enum attribute that summarises for the caller what the heck the call to get actually did (and a result attr, of course). A dataclass, why not. I wouldn’t even mind a namedtuple.
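A minimal sketch of that explicit-result-object idea; all the names here (Outcome, GetResult, the enum members) are mine, invented to illustrate the shape, not taken from any real API.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Any


class Outcome(Enum):
    CONVERTED = auto()  # value found and passed through the converter
    RAW = auto()        # value found and returned as-is
    DEFAULTED = auto()  # key missing, default used


@dataclass
class GetResult:
    outcome: Outcome  # summarises what the call to get actually did
    result: Any       # the value the caller should use

    @property
    def is_converted(self) -> bool:
        return self.outcome is Outcome.CONVERTED

    @property
    def is_default(self) -> bool:
        return self.outcome is Outcome.DEFAULTED
```

The caller then branches on one attribute instead of probing the runtime type of the return value.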
In my experience, helpers like these, where the caller has no idea what they’re getting back, are trying to be too clever. Essential details should not be masked from the user; the user should understand those details. What such helpers also do is help the user dig a hole for themselves when something unexpected happens.
If it can return 4 different things, then my code, just to call your helper, has to handle 4 different cases.
The library code might be more robust, but that part of my code just became up to 4 times more work to read and reason about. At least let me write those 4 cases in a nice, readable way, instead of testing the types of the retval.
If you say so. But at the risk of basing my own argument on my own stupidity:
If the implementation is hard to explain, it’s a bad idea.
I looked at your example on github. I like your code with the overloads too. I see that, I instantly understand that’s polymorphism. If you’re going to use static type checking, embrace it I say. Stick to the established norms of statically typed languages.
Is it true to say your suggestion would allow static type checkers to enforce different type checks on return values (and local variable annotations too?) depending on the call signature?
At the risk of stating the obvious, that API’s entirely possible in Python today, if get returns an object with a map method, that in turn returns something with an unwrap_or method. What am I missing?
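For concreteness, a minimal sketch of such an object with map and unwrap_or methods; the method names follow the comment above, but the Maybe class and its implementation are my assumption, not anything from the thread.

```python
from __future__ import annotations

from typing import Callable, Generic, TypeVar

T = TypeVar("T")
U = TypeVar("U")


class Maybe(Generic[T]):
    def __init__(self, value: T | None) -> None:
        self._value = value

    def map(self, fn: Callable[[T], U]) -> "Maybe[U]":
        # Apply fn only when a value is present; propagate the empty case.
        if self._value is None:
            return Maybe(None)
        return Maybe(fn(self._value))

    def unwrap_or(self, default: U) -> T | U:
        # Collapse back to a plain value, substituting the default if empty.
        return self._value if self._value is not None else default
```

With this, `get(key).map(convert).unwrap_or(default)` expresses the whole helper as a chain, which is presumably the “railroad” shape mentioned above.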
If the syntax for constructing a new type var allows for a default (I haven’t looked closely at the new syntax yet, or how it would interplay with defaults), would that maybe allow for this?
Something like
# here I show my ignorance of how to declare a type var
# with a default under the new syntax – guessing at the
# "= default" spelling from PEP 696
def get[_T = str, _D = None](
    self,
    key: str,
    default: _D = None,
    convert: Callable[[str], _T] = str
) -> _T | _D: ...
That accurately expresses the type of this helper, but relies on the fact that the default types match the types of the default arguments. It feels very close to the FirstResolved solution – to the point where I’m not fully sure if they differ at all in what they can express, even if they have a theoretical distinction.