Is there any way to make unspecified type variables default to their bound type?

Both MyPy and Pyright agree that x.shape has type Any.

import numpy as np

x = np.zeros(10)
reveal_type(x.shape)

This is because numpy.typing.NDArray is an alias for numpy.ndarray[Any, ...], where numpy.ndarray is generic over T and U, with T a type variable bound to tuple[int, ...]. The shape attribute of an ndarray has type T, so when Any is passed for T (as above), the shape has type Any.
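As a sketch of the flip side (assuming a recent numpy whose stubs make ndarray generic over its shape tuple), spelling out the shape type argument by hand gives the checker a precise type for .shape instead of Any:

```python
from __future__ import annotations

import numpy as np

# Explicitly annotating the shape type parameter instead of relying on the
# NDArray alias (which passes Any for it). Under recent numpy stubs a type
# checker should then reveal x.shape as tuple[int] rather than Any.
x: np.ndarray[tuple[int], np.dtype[np.float64]] = np.zeros(1, dtype=np.float64)
print(x.shape)
```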

I asked Eric about a similar case a long time ago and he clarified the situation: Question/feature request · microsoft/pyright · Discussion #5073 · GitHub

I was wondering if anything could be proposed to make the revealed type above tuple[int, ...]? It seems unfortunate that the type is Any, but I don’t see a good solution. Does anyone have any ideas?

Could the NDArray type alias be changed to use tuple[int, ...] as the first type argument rather than Any?

NDArray: TypeAlias = ndarray[tuple[int, ...], dtype[_ScalarType_co]]

This would probably cause issues with functions that expect an ndarray of a certain dimensionality, and it also prevents annotating x with the actual dimensions, since tuple[int, ...] is not assignable to e.g. tuple[int, int]. It also can't easily be narrowed through simple assertions the way an actual tuple can: mypy Playground

from typing import Generic, TypeVar

T = TypeVar("T", bound=tuple[int, ...])

class Foo(Generic[T]):
    dim: T


x: Foo[tuple[int, ...]]
assert len(x.dim) == 2  # not enough to narrow, we need a TypeIs/TypeGuard
y: Foo[tuple[int, int]] = x
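A narrowing helper along those TypeIs lines might look like this (a sketch; TypeIs lives in typing on Python 3.13+ and in typing_extensions before that):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from typing import TypeIs  # Python 3.13+; earlier: typing_extensions


def is_2d(dim: tuple[int, ...]) -> TypeIs[tuple[int, int]]:
    # The runtime check mirrors the static claim: exactly two axes.
    return len(dim) == 2


shape: tuple[int, ...] = (3, 4)
if is_2d(shape):
    # A checker now treats shape as tuple[int, int] in this branch.
    print(shape)
```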

We would need a separate tuple type with gradual length but a concrete element type for this to work ergonomically. Something like this has previously come up in the discussion around whether or not tuple[Any, ...] should be a gradual type in its length.
