typing.FirstResolved as a mechanism to refer to the first resolved TypeVar among those passed

The mypy issue around TypeVars and Optional triggered this idea.

In utility helpers dealing with data transformation and/or loading, the return value can be a product of multiple input parameters.

For example:

def get(
    self,
    key: str,
    default: _D | None = None,
    convert: Callable[[str], _T] | None = None,
) -> _D | _T | str | None:
    return self.config.get(self.name, key, convert=convert, default=default)

This needs about four overloads to narrow down the types, in particular because an Optional does not transfer to a TypeVar without at least a mypy error.

Instead, I would love to be able to write something like:

def get(
    self,
    key: str,
    default: _D | None = None,
    convert: Callable[[str], _T] | None = None,
) -> FirstResolved[_T, str] | FirstResolved[_D, None]:
    return self.config.get(self.name, key, convert=convert, default=default)

# or even
def get(
    self,
    key: str,
    default: _D | None = None,
    convert: Callable[[str], _T] = str,
) -> _T | FirstResolved[_D, None]:
    return self.config.get(self.name, key, convert=convert, default=default)

I like type hints. But if you’re writing a method that returns a choice of None, a string or two different generics, is the cost/benefit of using a static type checker in a dynamic language really still worth it?

1 Like

I’m not sure I fully understand the proposal – it looks like this is a way of having a type var pick up the type of a default argument? What happens in a context like a generic class where the type var is already bound; is that just an error?

To me, this looks like a special kind of overload. I wonder if it could be satisfied by improving the ergonomics of overload? I find overload very clunky to use when two or more arguments control the output type. Maybe this proposal is the best way to spell this, but it sure would be nice to be able to write

@overload
def get(default: None, convert: Callable[[str], _T]) -> _T | None: ...

in such cases without being told off for all of the omitted arguments – their types being implicitly whatever the real function definition uses.

I think this is the wrong question to be asking. Assume it’s a 30,000 LOC project which successfully lints with mypy --strict. Now you want to add a helper with these semantics. What do you do?

In these sorts of cases, I have sometimes written more verbose runtime code with multiple different helpers, rather than a stack of 6+ overloads, to express what I would have done “in the old days” with a single function.

Wanting to be able to write the natural function for these use cases with correct type hints makes perfect sense.

2 Likes

In structure/usage the return types pretty much collapse by input choice; it’s just that it’s a pain to add a matrix of overloads when multiple inputs affect the output.

1 Like

It’s indeed a clunky overload, as the overloads in that case don’t actually express different behavior but shoehorn type flow.

Fair enough, I know your point is it adds extra overloads, but I prefer the first example, as it’s the most readable. Even though you are arguing against it.

I’m not objecting to your idea by the way; it doesn’t seem like it will break anything or be backwards incompatible (but I don’t have to maintain the language). I just don’t think I would ever use it myself.

Fair enough. It makes an already type-checked code base like that more robust and adds value there.

Now you want to add a helper with these semantics. What do you do?

I wouldn’t try to solve it with type checking. Use a railroad.

Off the top of my head, if it’s a more complicated return than Maybe (which None or even exceptions can handle perfectly well), I’d define a new class that the helper returns. Either with predicate methods e.g. is_converted, is_default, or maybe an Enum attribute that summarises for the caller what the heck the call to get actually did (and a result attr of course). A dataclass, why not. I wouldn’t even mind a namedtuple.
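A minimal sketch of that result-object idea, assuming a free-standing `get` helper over a plain dict (all names here are illustrative): the Enum attribute tells the caller what the call actually did, and the value rides along on a dataclass.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Any, Callable, Generic, Optional, TypeVar

_T = TypeVar("_T")


class Outcome(Enum):
    CONVERTED = auto()   # convert() was applied to a stored value
    RAW = auto()         # stored string returned untouched
    DEFAULTED = auto()   # key was missing, default used


@dataclass
class GetResult(Generic[_T]):
    outcome: Outcome
    value: _T

    def is_default(self) -> bool:
        return self.outcome is Outcome.DEFAULTED


def get(
    data: dict[str, str],
    key: str,
    default: Any = None,
    convert: Optional[Callable[[str], Any]] = None,
) -> GetResult[Any]:
    raw = data.get(key)
    if raw is None:
        return GetResult(Outcome.DEFAULTED, default)
    if convert is not None:
        return GetResult(Outcome.CONVERTED, convert(raw))
    return GetResult(Outcome.RAW, raw)


result = get({"port": "8080"}, "port", default=80, convert=int)
```

The caller can then branch on `result.outcome` explicitly instead of sniffing the type of the return value.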

In my experience, helpers like these, where the caller has no idea what they’re getting back, are being too helpful and trying to be too clever. Essential details should not be masked from the user; the user should understand those details. What such helpers also help the user do is dig a hole for themselves when something unexpected happens.

If it can return 4 different things, then my code, just to call your helper, has to handle 4 different cases.
The library code might be more robust, but that part of my code just got up to 4 times more work to read and reason about. At least let me make writing those 4 cases nice and readable, instead of testing the types of the return value.

That’s what you got wrong: the input signature always reduces it to two types, or one; the caller never has to handle all of them.

But anyone reading the overloads has to deal with the types not being passed through.

If you say so. But at the risk of basing my own argument on my own stupidity:

If the implementation is hard to explain, it’s a bad idea.

I looked at your example on GitHub. I like your code with the overloads too. When I see that, I instantly understand: that’s polymorphism. If you’re going to use static type checking, embrace it, I say. Stick to the established norms of statically typed languages.

Is it true to say your suggestion would allow static type checkers to enforce different type checks on return values (and local variable annotations too?) depending on the call signature?

Thing is, it’s not polymorphism, it’s type flow: the code is the same in all cases, the types just differ.

In a language with a monadic option type this would be trivial as it would spell something like

data.get(name).map(convert).unwrap_or(default)

Unfortunately, Python’s optionals are more cumbersome.

1 Like

I’ve never heard of Type flow before.

data.get(name).map(convert).unwrap_or(default)

At the risk of stating the obvious, that API’s entirely possible in Python today, if get returns an object with a map method, that in turn returns something with an unwrap_or method. What am I missing?
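For what it’s worth, here is a minimal sketch of such a wrapper in today’s Python (the `Maybe` class and the `get` helper are illustrative, not an existing library API):

```python
from typing import Callable, Generic, Optional, TypeVar

_T = TypeVar("_T")
_U = TypeVar("_U")


class Maybe(Generic[_T]):
    """A tiny option type: holds a value or None."""

    def __init__(self, value: Optional[_T]) -> None:
        self._value = value

    def map(self, fn: Callable[[_T], _U]) -> "Maybe[_U]":
        # Apply fn only if a value is present; propagate emptiness otherwise.
        if self._value is None:
            return Maybe(None)
        return Maybe(fn(self._value))

    def unwrap_or(self, default: _U):
        # Return the contained value, or the default if empty.
        return self._value if self._value is not None else default


data = {"port": "8080"}


def get(name: str) -> Maybe[str]:
    return Maybe(data.get(name))


present = get("port").map(int).unwrap_or(80)
absent = get("host").map(int).unwrap_or(80)
```

The runtime part is easy; the hard part, and the point of the thread, is getting a type checker to track `_T` versus `_U` through such chains without a pile of overloads.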

At the risk of stating the obvious, programming as if Python were Haskell is typically not feasible in good faith.

Isn’t this similar to the PEP for default values of TypeVars?

The default is not quite the same use case.

Here I care primarily about passing on resolved types.

If the syntax for constructing a new type var allows for a default (I haven’t looked closely at the new syntax yet, or how it would interplay with defaults), would that maybe allow for this?

Something like

# here I show my ignorance of how to declare a type var
# with a default under the new syntax
def get[_T::str, _D::None](
    self,
    key: str,
    default: _D = None,
    convert: Callable[[str], _T] = str
) -> _T | _D: ...

That accurately expresses the type of this helper, but relies on the fact that the default types match the types of the default arguments. It feels very close to the FirstResolved solution – to the point where I’m not fully sure if they differ at all in what they can express, even if they have a theoretical distinction.

That would be incorrect in the sense that _T would always need to be inferred from the convert parameter.

And for the default, a default type is also not needed; it can be inferred from the passed argument (however, this currently triggers errors in mypy).

PS: PEP 696 – Type defaults for TypeVarLikes | peps.python.org

1 Like