PEP 746: TypedMetadata for type checking of PEP 593 Annotated

While I see the mention of raising to avoid needing a real implementation, having to define a function to get any use of the feature at all still seems unfortunate. All of the preexisting cases where I would use this are inside library code that’d basically boil down to:

from dataclasses import dataclass
from typing import Callable, Generic, TypeVar

T = TypeVar("T")

@dataclass
class Obj(Generic[T]):
    default: T
    parse: Callable[..., T]

    def __supports_type__(self, t: T) -> bool: ...

I’m not sure, off the top of my head, whether I think this makes sense, but it would at least be ergonomically ideal if one could instead write something like __supports_type__: ClassVar[T], so that it’s explicit that it’s a typing-only construct.
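A minimal sketch of the ergonomics I have in mind (this is just the suggestion above, and ClassVar is not normally allowed to contain a type variable, so treat it as illustrative only):

from dataclasses import dataclass
from typing import Callable, ClassVar, Generic, TypeVar

T = TypeVar("T")

@dataclass
class Obj(Generic[T]):
    default: T
    parse: Callable[..., T]

    # Declared purely for static type checking; never assigned at runtime.
    __supports_type__: ClassVar[T]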

This is perhaps less clear or harder to implement, I’m not sure. I might almost think that, similar to how __annotations__ is sort-of magically produced from annotations, it could be nice to have def __supports_type__(self, t: int) produce __supported_type__: ClassVar[int]? That way you could just directly define that attribute instead when opting out of runtime checking.


As far as names go, I personally think including “annotated” in the name is kind of important, given that this feature is only meaningful for an item in an Annotated position. Something like __valid_with_annotated__ is obviously fairly long, but to me it conveys the right idea, whether it’s a function or a bare annotation.

Not that it’s a strong argument in favor of runtime behavior, but defining the function is just raise NotImplementedError. Is that much worse than ...?

I’m not so concerned about ... vs raise. It’s more the requirement of a fair amount of syntactic ceremony to define a function, when many useful cases only need the type of that function’s input value.

The absolute minimum information required for a type checker to perform the verification here is the __supports_type__: T above. But for the sake of enabling extra, fancier runtime checking, that gets turned into def __supports_type__(self, t: T) -> bool: raise NotImplementedError().

That’s a fair point; __supports_type__: T is considerably more succinct. I plan on covering this topic at the Typing Meetup on July 17th; would you be able to attend and propose that there?


I also think an attribute fits better than a method here.

It seems out of place to include runtime checking functionality here, especially with the name __supports_type__ implying you’re only checking the type, while the parameter is actually the value.

I’ll try to attend the meetup on the 17th as I’m interested in further discussion about it!

Here are my notes from the meetup; others, please feel free to chime in:

  1. There was no discussion or pushback against the static typing proposals of this PEP.
  2. Multiple participants questioned the value of overloads and the ability to return Literal[False]. Without a concrete example where this is useful it seems like it will be hard to justify the added complexity.
  3. If we don’t need overloads and decide not to push forward the runtime aspects of this proposal, something like __supports_type__: T may be sufficient. This may also be easier to name.

My conclusion from this is that we should change the proposal to be:

from dataclasses import dataclass

@dataclass
class Gt[T]:
    __supported_annotated_type__: T

Or something along these lines (name still open to feedback).
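For concreteness, here is a hedged sketch of what a type checker would be expected to do with such an attribute (the flagged line shows the intended behavior under this proposal, not anything an existing checker implements):

from dataclasses import dataclass
from typing import Annotated

@dataclass
class Gt[T]:
    __supported_annotated_type__: T

ok: Annotated[int, Gt(0)]   # fine: int is assignable to the supported type, int
bad: Annotated[str, Gt(0)]  # should be rejected: str does not match the supported type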

Personally I am very interested in runtime use cases and hope that we can still prototype some sort of consensus amongst libraries that use metadata at runtime so that we can standardize in a future PEP, be that by allowing __supported_annotated_type__ to be a method, introducing a new method, etc.


@Jelle I know you’re a proponent of runtime uses. How do you feel about the above proposal?

I still like the idea of using a method, so that you can define in one go both where the annotation is allowed statically (through the annotations on the method, which are checked by static type checkers) and what kinds of values are accepted at runtime (through the method implementation). However, it’s clear that consensus is going the other way, so I’m OK with using a simple attribute instead.
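To illustrate that appeal concretely (a sketch only, reusing the Gt example and the earlier __supports_type__ name from this thread; none of this is settled):

from dataclasses import dataclass

@dataclass
class Gt:
    bound: int

    # The parameter annotation is what a static checker reads: this metadata
    # may only be attached to Annotated bases assignable to int. The body is
    # the runtime side, deciding what kinds of values are accepted (a real
    # library could do fuller validation here).
    def __supports_type__(self, value: int) -> bool:
        return isinstance(value, int)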

As for naming, it would be nice to align with the terminology I introduced on GitHub: in Annotated[T, M1, M2, ...], T is the base expression and M1, M2, … are metadata elements. The name we look for should communicate that the metadata element gets to say whether the base expression makes sense with this metadata element. We also want to make it clear that this has something to do with Annotated. Therefore, I’d suggest __supports_annotated_base__.

Narrow-minded dumb question: Now that converter is in the dataclass_transform spec, could the pipeline return a callable whose type is correctly declared/inferred and use that as a converter?

__supports_annotated_base__ sounds good to me!

One thing I will bring up is that, if it’s an attribute, it might interact poorly with dataclasses. So maybe we need to specify that it can also be defined as a body-less property?

could the pipeline return a callable whose type is correctly declared/inferred and use that as a converter?

That might be interesting for the pipeline-specific use case, since it can do transformations, but I don’t think it’s a general approach for the other things this proposal is aiming to help with.

I would think you could just define it as __supports_annotated_base__: ClassVar[T] to avoid it affecting dataclasses.
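A quick runnable sketch of why that helps (using a concrete int rather than T for simplicity): dataclasses skip ClassVar-annotated names, so the dunder never becomes an __init__ parameter or a field.

from dataclasses import dataclass, fields
from typing import Annotated, ClassVar

@dataclass
class Gt:
    bound: int
    # Skipped by @dataclass: ClassVar annotations are not turned into fields.
    __supports_annotated_base__: ClassVar[int]

print([f.name for f in fields(Gt)])  # ['bound']
x: Annotated[int, Gt(5)]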