The number of upvotes shows that there is definitely demand for this feature. With the current property-based workaround, it would look like this:
from abc import ABC, abstractmethod

class Foo(ABC):
    @property
    @abstractmethod
    def myattr(self) -> int: ...

class Bar(Foo):
    myattr: int = 0

class Baz(Foo):
    @property
    def myattr(self) -> int:
        return 0
There are several differences from having true abstract attributes:
attributes are accessible at the class level: Bar.myattr would return 0, but Baz.myattr would yield <property at 0x....>
properties, by default, have no setter.
properties are evaluated lazily, i.e. the value of Baz().myattr is computed anew on each access rather than stored on the instance.
editors like PyCharm/VS Code fail to relate the attribute Bar.myattr to the property of the superclass.
Pitch
Add an abstract typing construct that allows us to express the idea like this:
from abc import ABC, abstract

class Foo(ABC):
    myattr: abstract[int]  # <- subclasses must have an integer attribute named `myattr`

class Bar(Foo):
    myattr: int = 0
Then, when instantiating Bar, at the end of super().__init__ an automatic hasattr(self, "myattr") check would determine whether to raise TypeError: Cannot instantiate class with abstract attribute `myattr`.
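As a rough sketch of the intended runtime behavior (purely illustrative: abstract does not exist today, so a hypothetical __abstract_attrs__ tuple stands in for the annotation):

from abc import ABC

class Foo(ABC):
    # hypothetical stand-in for `myattr: abstract[int]`
    __abstract_attrs__ = ("myattr",)

    def __init__(self) -> None:
        # the proposed check: each declared abstract attribute must exist
        # by the time super().__init__ finishes
        for name in self.__abstract_attrs__:
            if not hasattr(self, name):
                raise TypeError(
                    f"Cannot instantiate class with abstract attribute {name}"
                )

class Bar(Foo):
    myattr: int = 0  # class attribute, so the hasattr check passes

Bar()    # ok
# Foo() would raise TypeError: Cannot instantiate class with abstract attribute myattr

Note that a subclass setting the attribute inside its own __init__ would have to do so before calling super().__init__(); that ordering caveat comes up again further down the thread.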
I’m +1 on this feature. But I’d like to share the workaround I’ve been using in the meantime: make the desired “abstract” attribute a required argument to the abstract parent class’s constructor, which the child classes pass in via their own constructor’s super().__init__() call.
from abc import ABC

class Foo(ABC):
    myattr: int

    def __init__(self, myattr: int):
        self.myattr = myattr

class Bar(Foo):
    def __init__(self):
        super().__init__(myattr=15)
This satisfies the condition that “the child class must provide a value for myattr”.
I’d suggest the name be abstractattribute or abstractattr instead of just abstract to keep with the same naming convention as abstractmethod.
I definitely miss this feature, though my preferred solution would actually be to treat all attributes declared on abstract classes (those inheriting from ABC) as abstract if they’re not set in the __init__ (or if the class doesn’t even have an __init__).
Still, adding a new abstract[] type construct is also completely fine with me.
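A rough sketch of that rule (my own illustration, not tested against the full proposal): scan __annotations__ across the MRO at subclass-creation time and treat every name that is annotated but never assigned as abstract:

from abc import ABC

class AnnotatedABC(ABC):
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # names annotated anywhere in the MRO ...
        annotated = {
            name
            for klass in cls.__mro__
            for name in klass.__dict__.get("__annotations__", {})
        }
        # ... that were never actually assigned a value
        cls.__missing_attrs__ = {name for name in annotated if not hasattr(cls, name)}

class Base(AnnotatedABC):
    myattr: int  # annotated but never assigned -> treated as abstract

class Impl(Base):
    myattr = 0   # assignment satisfies the requirement

print(Base.__missing_attrs__)  # {'myattr'}
print(Impl.__missing_attrs__)  # set()

Attributes that are only assigned inside __init__ are invisible to this class-level scan, which is exactly the hard part such a proposal would need to solve.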
Does this proposal promise only read access or read/write? I noticed in the example you gave, a child class can satisfy the abstract attribute requirement with a getter property (without a setter). Is this intended? Or does the abstract attribute promise a setter too?
If you don’t promise the setter, then self.myattr = 5 will be a type error. If you do promise the setter, child classes will have to provide it.
Maybe if we’re going this route, it would help to make the getter/setter behavior explicit?
class Foo:
    x: abstractgetter[int]     # getter promised only
    y: abstractattribute[int]  # getter and setter promised, with the type being a subtype of int
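For comparison, a sketch of how today’s typing.Protocol already expresses the two promises separately: a property member promises only a getter, while a plain attribute promises both getter and setter:

from typing import Protocol

class HasGetter(Protocol):
    @property
    def x(self) -> int: ...  # read-only: assigning to .x is a type error

class HasAttribute(Protocol):
    y: int  # read-write: both reading and assigning .y are allowed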
I think it would make sense to allow read-only attributes by default. If a setter is required, this should be type-hinted separately, as is currently done with properties. My suggestion is to simply have:
attr: Abstract[ClassVar[int]] ⇝ test hasattr(cls, "attr") post __new__.
attr: Abstract[int] ⇝ test hasattr(self, "attr") post __init__.
The easy way: simply add the checks to ABCMeta, so that they run when super().__new__ and super().__init__ are called. (Currently, @abstractmethod is checked during super().__new__.)
This however has a disadvantage: it requires the user to make sure to call super().__new__ and super().__init__ only after adding the methods/attributes. This can be problematic, because sometimes super().__new__ and super().__init__ need to be called early to set up infrastructure (torch.nn.Module comes to mind).
The ideal way: perform the checks after __new__/__init__ have finished. This would give the user maximum flexibility in how they write custom __new__/__init__. However, I am not sure whether it is doable with the current class-creation setup.
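For what it’s worth, the instance-level half seems doable today with a metaclass whose __call__ runs the check only after __init__ has returned; a minimal sketch (ABCWithAttrsMeta and __abstract_attrs__ are made-up names):

from abc import ABCMeta

class ABCWithAttrsMeta(ABCMeta):
    def __call__(cls, *args, **kwargs):
        # type.__call__ runs __new__ and __init__; the check happens afterwards
        instance = super().__call__(*args, **kwargs)
        for name in getattr(cls, "__abstract_attrs__", ()):
            if not hasattr(instance, name):
                raise TypeError(
                    f"Cannot instantiate {cls.__name__} with abstract attribute {name}"
                )
        return instance

class Foo(metaclass=ABCWithAttrsMeta):
    __abstract_attrs__ = ("myattr",)

class Bar(Foo):
    def __init__(self):
        super().__init__()  # may be called early now ...
        self.myattr = 0     # ... since the check only runs after __init__ returns

Bar()    # ok
# Foo() would raise TypeError: Cannot instantiate Foo with abstract attribute myattr

Because the check lives in the metaclass __call__, the ordering of super().__init__() inside subclass constructors no longer matters.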
I found this other post with the idea of adding a general __post__ hook for class creation. This would be ideal for ABCs. We already have __post_init__ in dataclasses. If there were generic hooks __post_init__ and __post_new__, ABCs could perform class/instance validation there, once it is guaranteed that __new__ and __init__ have finished. Packages like pydantic would probably benefit greatly from this.
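For reference, dataclasses already demonstrate the pattern: __post_init__ runs after the generated __init__, so every field is guaranteed to be set by the time validation happens:

from dataclasses import dataclass

@dataclass
class Point:
    x: int
    y: int

    def __post_init__(self) -> None:
        # runs after the generated __init__, so x and y are guaranteed set
        if self.x < 0 or self.y < 0:
            raise ValueError("coordinates must be non-negative")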
Personally, I think the checks in ABCMeta are an anachronism. Inheriting from ABC brings in an unwanted metaclass (that’s usually unnecessary—the checks it does could have been provided by an ordinary base class, and the registration it supports is not needed here).
Currently, abstractmethod is checked by type checkers whether you inherit from ABC or not. And I think that’s the main benefit of this feature.
As you rightly point out, there are significant complications to ensuring that a member variable exists—unlike methods, which can be checked in __init_subclass__.
Also, you can’t generally verify the type.
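To make the method case concrete, a minimal sketch of checking required methods in __init_subclass__, without any metaclass (the required_methods tuple is a made-up convention):

class Base:
    # made-up convention: names every concrete subclass must define
    required_methods = ("run",)

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        for name in cls.required_methods:
            if not callable(getattr(cls, name, None)):
                raise TypeError(f"{cls.__name__} must define method {name!r}")

class Worker(Base):
    def run(self) -> None:
        print("working")

# class Broken(Base): ...   # would raise TypeError at class-definition time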
Are runtime checks really that important? What are they usually catching that static checks aren’t?
For example, if an attribute is added dynamically during __init__. But that’s why the type hint stuff is great, because it means you can have your cake and eat it too:
Type hints /decorators only ⇝ static checking
Subclassing ABC ⇝ uses type hints / decorators to automatically generate runtime checks
Generally not, though in many simple cases you can. But more crucially, I think the runtime check is about verifying that the attribute/method exists at all. For example, if you work a lot in interactive notebooks, the runtime checks are pretty useful since static-checking support there is limited.
Isn’t the issue with this work-around that there is no way to ensure that implementors actually call super().__init__(...) in their constructors?
There may be ways to enforce it at runtime via metaclass tricks, but in terms of static type checking I’m not aware of any way to enforce it. And then this pattern becomes fairly bug-prone, because the type checker will believe that the attribute is defined at type-checking time, while it may be entirely undefined at runtime (see the sketch below).
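A minimal illustration of that failure mode, reusing the Foo workaround from above (Careless is a hypothetical subclass):

from abc import ABC

class Foo(ABC):
    myattr: int

    def __init__(self, myattr: int):
        self.myattr = myattr

class Careless(Foo):
    def __init__(self):
        pass  # forgot super().__init__(myattr=...) -- no static checker complains

c = Careless()
c.myattr  # AttributeError at runtime, yet type checkers see `myattr: int` and stay silent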
For these reasons I think the property + abstractmethod approach currently gives better type safety, but the boilerplate is indeed painful. I’d love to see better (type-safe) support for abstract fields!
I’ve found what I think is the best workaround right now: you use __init_subclass__ parameterization.
I have not tested this with mypy or Pylance (I don’t have those luxuries on my system), but I’d be curious to know how this is handled.
from abc import ABC
from inspect import isabstract
from typing import ClassVar, Optional

class OurABC(ABC):
    abstract_classvar: ClassVar[int]
    abstract: ClassVar[bool]

    def __init_subclass__(cls, *, abstract_classvar: Optional[int] = None, abstract: bool = False):
        cls.abstract = True
        if not abstract and not isabstract(cls):
            cls.abstract = False
            if abstract_classvar is None:
                # concrete subclasses must pass the classvar as a class keyword argument
                raise TypeError(f"{cls.__name__} must be declared with abstract_classvar=...")
            cls.abstract_classvar = abstract_classvar
        return super().__init_subclass__()

class Concrete(OurABC, abstract_classvar=5):
    ...
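For illustration (not from the original post), a subclass that omits the class keyword fails at class-definition time, while explicitly marked abstract intermediates pass through:

class Intermediate(OurABC, abstract=True):  # ok: explicitly marked as abstract
    ...

# class Broken(OurABC): ...   # would raise TypeError at class-definition time,
#                             # since abstract_classvar was not passed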