I don’t think @typing_error is really a good solution to your problem, since you’ll have to maintain a growing number of overloads with each NewType you add.
It seems more like what you want is a variant of NewType that does not allow assigning to the original type. One way to achieve this would be a keyword-only flag on NewType (let’s call it inherits for the purpose of discussion) that indicates whether the type checker should treat the NewType as a nominal subtype of the original type. Of course, the issue with that is that inside your process method the type checker would likely complain, so you’d probably end up casting everything back to ndarray.
from typing import NewType, Self

class Foo:
    def method(self, arg1: "Foo") -> Self: ...

A = NewType("A", Foo)
B = NewType("B", Foo, inherits=False)  # like A, but cannot be assigned to Foo

def test(foo: Foo, a: A, b: B) -> None:
    a.method(foo)  # OK
    a.method(a)    # OK
    b.method(foo)  # OK
    b.method(b)    # illegal, B cannot be assigned to "Foo"
Calling a function that returns Never should not be a type checker error. The canonical example is sys.exit(), which never returns because that’s exactly what it’s supposed to do.
The problem is not calling sys.exit(); the issue that has been pointed out again and again is that type checkers are fine with outer_function(sys.exit()), that is, passing Never as an argument, which creates a witness of an instance of Never, a type that is supposedly uninhabited.
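A minimal illustration (outer_function here is given an arbitrary int-taking signature just for the example):

import sys

def outer_function(x: int) -> None:
    print(x + 1)

# Type checkers accept this call: sys.exit() is annotated as returning Never,
# and Never is assignable to every type, including int. At runtime the program
# exits before outer_function is ever invoked.
outer_function(sys.exit())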
Right, and you’d likely want to return an Image, but since “Images are just NumPy arrays” you’d then want to be able to pass them to functions from NumPy, SciPy, etc.
Although, maybe you could get away with the following (depending on how typing_error would interact with multiple overload matches):
import numpy as np
from typing import overload

type AnyArrayAlias = Mask | Image | LabeledImage | ...  # etc., one entry per alias

@overload
def process(arr: Image) -> None: ...  # Image is matched by this overload first
@overload
@typing_error  # the decorator discussed above
def process(arr: AnyArrayAlias) -> None: ...  # catches the other array aliases
@overload
def process(arr: np.ndarray) -> None: ...  # plain ndarrays are still accepted
That saves you from having to write one overload for each of the bad aliases.
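For illustration, here is how calls might resolve under that scheme (assuming @typing_error makes selection of its overload a reported error):

def demo(img: Image, mask: Mask, arr: np.ndarray) -> None:
    process(img)   # matches the Image overload: OK
    process(mask)  # matches the @typing_error overload: flagged by the type checker
    process(arr)   # falls through to the plain np.ndarray overload: OK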
After reading PEP 800 (still a draft), I initially thought it might be part of the puzzle, but after spending more time on it I’m not so sure. Still noting it here just in case.
PEP 800 proposes the disjoint_base decorator, which declares that the decorated bases can never have a common child class.
from typing import disjoint_base  # as proposed by PEP 800 (draft); actual import location may differ

@disjoint_base
class AnyArray: ...

@disjoint_base
class Image(AnyArray): ...

@disjoint_base
class Mask(AnyArray): ...

def process(x: Image | AnyArray) -> None: ...

process(Image(...))     # should work
process(AnyArray(...))  # should work
process(Mask(...))      # should fail
The PEP doesn’t directly touch on this case. I initially thought this setup would preclude passing Mask to process, but it seems to me that the disjoint_base decorator only restricts inheritance between such classes, and that doesn’t preclude passing Mask, since it’s still a child of AnyArray. Right?
This is a misuse of PEP 800. It should not be applied unless it reflects actual runtime semantics caused by C-level implementation details that aren’t otherwise visible to type checkers. See the discussion threads for that PEP.
This isn’t the intended use case of PEP 800 and I wouldn’t expect it to work. The type Image | AnyArray is equivalent to just AnyArray because Image is a subclass of AnyArray, and Mask remains a subclass of AnyArray too.
I don’t think it’s invalid to use the decorator for its intended use case (restricting inheritance and intersections) even if it’s not enforced at runtime, just as it is legal to use @final to indicate that a class shouldn’t be inherited from even if inheritance is technically possible at runtime.
IIRC the discussion thread explicitly advised against that, because the semantics are surprisingly tricky to understand properly and don’t quite match what people would expect. (See this very idea, which doesn’t match the semantics, or an idea proposed in that thread that wants other, different semantics.) Better to just not use it, to prevent confusion.
The problem looks like this to me:
class A: pass
class B(A): pass
class C(A): pass

def the(x: A | C) -> None:
    pass

the(A())  # OK
the(C())  # OK
the(B())  # passes, but should fail
The last call should fail but passes because B is a subclass of A.
I didn’t look at the definitions of Image and Mask (they correspond to B and C here), but from the sample it seems likely that they need to inherit from A. If that’s true, then I don’t currently know a solution, because as far as I know it is currently not possible to inherit from A without being judged compatible with A, which is the problem.
The easiest option looks like declaring a class A2(A) and forcing users to replace direct uses of A with A2, but clearly that’s not acceptable here.
If I could count on new language features, I think the cleanest solution would be to let type checkers treat B as not a subtype of A even though it inherits from A, like this:
from typing import disconnect_with_base
class B(disconnect_with_base(A)): pass # Runtime: disconnect_with_base returns A unchanged
(But if there is a def f(x: np.ndarray) and you pass an Image to it, it won’t pass the type check. If that’s not wanted, then this solution also needs a method to convert an Image into an np.ndarray, and users have to use it.)
I think optype’s Just would work for that.
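A rough sketch of how that might look for the A/B/C example above (assuming Just[T] accepts instances of exactly T and rejects strict subtypes):

from optype import Just

def the(x: Just[A] | C) -> None:
    pass

the(A())  # OK
the(C())  # OK
the(B())  # rejected: B is a strict subtype of A, so it does not match Just[A]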
Edit: the link didn’t work; thanks @mfile_bay for noticing. Seems like a GitHub Mobile issue.
Your link is broken, here’s the fixed link: optype/optype/_core/_just.py at master · jorenham/optype · GitHub
Oops, yeah, thanks for noticing!
Internally, we would annotate with Image and Mask, but other libraries may define f as you did above, and Images should, indeed, be passable to those functions.
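To restate the two requirements as a sketch (assuming Image and Mask are NewTypes over np.ndarray, as elsewhere in this thread; f stands in for a third-party function):

import numpy as np
from typing import NewType

Image = NewType("Image", np.ndarray)
Mask = NewType("Mask", np.ndarray)

def f(x: np.ndarray) -> None: ...  # third-party API, annotated with plain ndarray

def process(arr: Image | np.ndarray) -> None: ...  # internal API

def pipeline(img: Image, mask: Mask) -> None:
    f(img)         # must type-check: Images are "just NumPy arrays"
    process(img)   # must type-check
    process(mask)  # accepted today via the np.ndarray branch, but ideally rejected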