While I see the mention of raising to avoid needing a real implementation, the need to define a function to use the feature at all still seems unfortunate. All of the preexisting cases where I'd use this are inside library code that'd basically boil down to:
```python
@dataclass
class Obj(Generic[T]):
    default: T
    parse: Callable[..., T]

    def __supports_type__(self, t: T) -> bool: ...
```
I’m not sure if this makes sense off the top of my head, but it would at least be ergonomically ideal if one could instead write something like `__supports_type__: ClassVar[T]`, so that it’s explicit that it’s a typing-only construct.
This is perhaps less clear or harder to implement, I’m not sure. Similar to how `__annotations__` is sort-of magically produced from annotations, it might be nice to have `def __supports_type__(self, t: int)` implicitly produce `__supported_type__: ClassVar[int]`. That way you could just define the attribute directly when opting out of runtime checking.
As far as names go, I personally think including “annotated” in the name is important, given that this feature is only meaningful for items in an `Annotated` position. Something like `__valid_with_annotated__` is obviously fairly long, but to me it conveys the right idea, whether it’s a function or a bare annotation.
I’m not so concerned about `...` vs `raise`. More so the requirement of a fair amount of syntactical ceremony to define a function, when many useful cases only need the type of that function’s input value.
The absolute minimal amount of information required for a type checker to perform the verification here is the `__supports_type__: T` above. But for the sake of enabling extra, fancier runtime checking, that turns into `def __supports_type__(self, t: T) -> bool: raise NotImplementedError()`.
That’s a fair point. __supports_type__: T is considerably more succinct. I plan on covering this topic at the Typing Meetup on July 17th, would you be able to attend and propose that there?
I also think an attribute fits better than a method here.
It seems out of place to include runtime checking functionality here, especially since the name `__supports_type__` implies you’re only checking the type, while the parameter is actually the value.
I’ll try to attend the meetup on the 17th as I’m interested in further discussion about it!
Here are my notes from the meetup; others, please feel free to chime in:
There was no discussion or pushback against the static typing proposals of this PEP.
Multiple participants questioned the value of overloads and the ability to return Literal[False]. Without a concrete example where this is useful it seems like it will be hard to justify the added complexity.
If we don’t need overloads and decide not to push forward the runtime aspects of this proposal, something like `__supports_type__: T` may be sufficient. This may also be easier to name.
My conclusion from this is that we should change the proposal to be:
```python
@dataclass
class Gt[T]:
    __supported_annotated_type__: T
```
Or something along these lines (name still open to feedback).
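As a concrete sketch of how a metadata class might use the attribute form (the name follows the draft above and is still open to feedback; the annotation is typing-only and never assigned at runtime):

```python
from typing import Annotated, Generic, TypeVar

T = TypeVar("T")

class Gt(Generic[T]):
    # Typing-only declaration: a checker implementing the proposal would
    # compare T against the base expression of the surrounding Annotated
    # form. No value is ever assigned to this name at runtime.
    __supported_annotated_type__: T

    def __init__(self, bound: T) -> None:
        self.bound = bound

# Accepted by a conforming checker, since the base expression is int:
PosInt = Annotated[int, Gt(0)]
```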
Personally I am very interested in runtime use cases and hope that we can still prototype some sort of consensus among libraries that use metadata at runtime, so that we can standardize it in a future PEP, whether by allowing `__supported_annotated_type__` to be a method, introducing a new method, etc.
I still like the idea of using a method, so that you can define in one go both where the annotation is allowed statically (through the annotations on the method, which are checked by static type checkers) and what kinds of values are accepted at runtime (through the method implementation). However, it’s clear that consensus is going the other way, so I’m OK with using a simple attribute instead.
As for naming, it would be nice to align with the terminology I introduced earlier: in `Annotated[T, M1, M2, ...]`, `T` is the base expression and `M1, M2, …` are metadata elements. The name we are looking for should communicate that the metadata element gets to say whether the base expression makes sense with it. We also want to make it clear this has something to do with `Annotated`. Therefore, I’d suggest `__supports_annotated_base__`.
Narrow-minded dumb question: now that `converter` is in the `dataclass_transform` spec, could the pipeline return a callable whose type is correctly declared/inferred and use that as a converter?
One thing I will bring up is that if it’s an attribute it might interact poorly with dataclasses. So maybe we need to specify that it can also be defined as a body-less property?
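A quick runtime sketch of the interaction being raised here (the dunder name is the one under discussion; this only demonstrates stock `@dataclass` behavior, not any PEP semantics): a bare class-level annotation is collected as a field, whereas a body-less property is not.

```python
from dataclasses import dataclass, fields
from typing import Generic, TypeVar

T = TypeVar("T")

@dataclass
class GtAttr(Generic[T]):
    # Intended as typing-only, but @dataclass collects every annotated
    # class-level name, so this becomes a required __init__ parameter:
    __supported_annotated_type__: T
    value: T

@dataclass
class GtProp(Generic[T]):
    value: T

    # A body-less property sidesteps this: it carries no class-level
    # annotation, so @dataclass does not generate a field for it.
    @property
    def __supported_annotated_type__(self) -> T: ...

print([f.name for f in fields(GtAttr)])  # the dunder shows up as a field
print([f.name for f in fields(GtProp)])  # only 'value'
```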
> could the pipeline return a callable whose type is correctly declared/inferred and use that as a converter?
That might be interesting for the pipeline-specific use case, since it can do transformations, but I don’t think it’s a general approach for the other things this proposal aims to help with.
This might be a dumb question, but what exactly is the difference between this PEP and intersection types?
The reason I ask is that this
```python
class Gt[T]:
    __supports_type__: ClassVar[SupportsGt[T]]

    def __init__(self, value: T) -> None:
        self.value = value

type PosInt = Annotated[int, Gt(0)]
```
could also be written as an intersection type:
```python
type PosInt = int & SupportsGT[Literal[0]]
```
Which basically means that something is a `PosInt` if it is both an `int` and an instance of `SupportsGT[Literal[0]]`.
Perhaps I’m missing something here, but isn’t that what this PEP is also specifying?
Anyway, if this PEP is actually about intersection types, I think it would be cool to be able to use the A & B syntax for it.
It is not about intersection types, but only about type checking Annotated metadata, without an effect on the overall type system. The only new type checker errors that will appear are those that happen while checking the annotations themselves.
This is not correct. For example, `-1` is a valid `SupportsGT[Literal[0]]`, but it would not satisfy the `Gt(0)` predicate.
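To make the distinction concrete (plain runtime Python, no PEP semantics involved): for `-1` the comparison itself is well-typed, but the predicate it encodes is false.

```python
# -1 structurally supports being compared with 0, so it satisfies a
# protocol like SupportsGT[Literal[0]]: the operation is valid...
result = (-1) > 0

# ...but the *value* of that comparison is False, so -1 would not pass
# the runtime predicate that Gt(0) is meant to express.
assert result is False
```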
Ok that makes sense. I can see how this will be especially helpful for packages like Typer.
But from the perspective of static typecheckers, would there be a difference between the two?
BTW, I’m not advocating for turning this PEP into one about intersection types; I’m just trying to figure out how I’ll be able to use it in practice.
The difference is that an intersection type in a function’s signature would affect how the function can be called; the presence of an Annotated type with PEP 746 metadata would not.
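In other words (a minimal sketch; `Gt` here is a stand-in metadata class, not a real library API):

```python
from typing import Annotated

class Gt:
    def __init__(self, bound: int) -> None:
        self.bound = bound

def f(x: Annotated[int, Gt(0)]) -> int:
    # Under this proposal, the metadata is checked against the base
    # expression (int), but it carries no call-site meaning: any int is
    # accepted here. An intersection type in the signature would instead
    # change the parameter's type and thus what callers may pass.
    return x

print(f(-5))  # accepted; the metadata does not narrow the parameter
```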
Am I misunderstanding something, or does this restriction need to be implicitly relaxed in order for this feature to be useful (at least in a lot of top-of-mind use cases for me)?
Hmm great point, I just tried and can confirm. I had not thought of that.
I know I’ve let this issue linger a bit, in part because it’s not urgent to get it in by any release and I wanted it to collect any more feedback it could. Apologies if it’s been too long.
@erictraut what do you think about this issue with ClassVar, is there a good alternative?
Also would you be able to implement support for the new version of this PEP? I think it’s okay to drop support for the old version if you’d like (assuming you don’t think this ClassVar issue is a dealbreaker)?
Does this mean that `ClassVar` should be special-cased on `Foo` and ignored when applying the PEP 746 logic? I’m wondering what should happen in this case:
```python
from typing import Annotated, ClassVar

class Int64:
    __supports_type__: ClassVar[int]

class Test:
    a: Annotated[ClassVar[int], Int64()]  # Should be OK
    b: Annotated[int, Int64()]  # Is this OK? `b` is not a `ClassVar`
```
Using a ClassVar definition for this facility feels pretty hacky to me. There is no assigned value to this class variable at runtime, so this feels like a misuse of the ClassVar type qualifier. Not surprisingly, when you misuse aspects of the type system, you’ll often find that they don’t work well in certain cases. That’s what we’re seeing here with the use of generics.
When you initially posed this problem, I thought there was a need for a runtime component. That’s why I proposed a magic method. But it now sounds like no runtime component is desired — that this is purely a static analysis mechanism. Do I have that correct? If so, we may want to rethink the approach. Maybe we should explore a class decorator mechanism?
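Purely as a hypothetical sketch of the decorator direction (none of these names exist in any PEP or library; a static checker would have to special-case the decorator itself):

```python
from typing import Callable, TypeVar

_C = TypeVar("_C")

def supports_annotated_base(tp: object) -> Callable[[_C], _C]:
    """Hypothetical class decorator: records which base expression the
    decorated metadata class is valid with, avoiding a bare ClassVar
    annotation and the dataclass/generics pitfalls that come with it."""
    def deco(cls: _C) -> _C:
        # Stored on the class so it stays introspectable at runtime.
        cls.__supported_annotated_base__ = tp
        return cls
    return deco

@supports_annotated_base(int)
class Int64:
    pass

print(Int64.__supported_annotated_base__)  # the recorded base expression
```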
I’m also wondering if we should hold off on defining this mechanism until after we make additional progress on clarifying the use of `Annotated` for static types. Currently, there are no rules in the typing spec about `Annotated` metadata, so static type checkers cannot generally reason about metadata. This PEP attempts to define the first aspect of annotated metadata that static type checkers can reason about, but it feels very incomplete. I worry that if we proceed with this PEP, we could regret it once we flesh out more aspects of annotated metadata. Maybe we should try to do something more holistic.
I noticed that you updated the PEP to switch from a function to a `ClassVar`, but you didn’t incorporate the feedback about the name of the attribute. It’s still called `__supports_type__` in the latest draft. There was pretty strong feedback that this should be changed to a name that includes “annotated” within it.
I think for now I’m going to remove pyright’s existing provisional support for this PEP. I’d like to see the idea progress further before I reimplement another variant.