Isn’t that the point – that ABC shouldn’t be one of those “absolutely necessary” cases?
Sure. However within abc.py, the first attempt to construct ABCMeta does not use _py_abc.py (it’s only used as a fallback).
I generally think the ABC method is worse than using protocols (even runtime-checkable ones, if needed), but ABC came first, and it comes with runtime enforcement that subclasses implemented the abstract methods. Whether deep inheritance chains are a good thing isn’t relevant to what we can do now, though.
I don’t think removing the metaclass is likely to be a positive here, mostly because we can’t be sure about the impact. I’m not convinced that overriding the magic methods is actually a problem, given that it can already be done in other ways, so that shouldn’t be a reason not to find a way to remove the metaclass; I’m more concerned about breakage.
With that said, some of that might be me being slightly dismissive of certain design patterns as bad things for libraries to force on their users.
I think it’s also worth pointing out that at least as far as static analysis goes you don’t need to inherit from ABC or use ABCMeta in order to take advantage of abstractmethod. Type checkers[1] will treat any class containing methods decorated with abstractmethod as abstract and will give you the appropriate error if you try to create an instance of that class or a subclass without first writing an implementation in that subclass.
So if you don’t care about runtime or virtual subclasses then you can still use abstractmethod. If you care about getting a runtime error when creating an instance of an abstract class, you could probably separate out just that part into a new common base class.
or at least mypy does; it doesn’t look like pyright does this ↩︎
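To make the “separate out just that part” idea concrete, here’s a rough sketch of what such a common base class might look like. `RuntimeAbstract` and its internals are invented for illustration, not anything in the stdlib; it reuses `__init_subclass__` and the `__isabstractmethod__` marker that `abstractmethod` sets:

```python
from abc import abstractmethod

class RuntimeAbstract:
    """Illustrative base class: refuses instantiation while any
    @abstractmethod in the MRO is still unimplemented, without ABCMeta."""
    _still_abstract = frozenset()

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # Names marked abstract anywhere in the MRO...
        declared = {
            name
            for base in cls.__mro__
            for name, value in vars(base).items()
            if getattr(value, "__isabstractmethod__", False)
        }
        # ...that are still abstract when resolved on this class.
        cls._still_abstract = frozenset(
            name for name in declared
            if getattr(getattr(cls, name, None), "__isabstractmethod__", False)
        )

    def __new__(cls, *args, **kwargs):
        if cls._still_abstract:
            raise TypeError(
                f"Can't instantiate abstract class {cls.__name__} "
                f"with abstract methods {', '.join(sorted(cls._still_abstract))}"
            )
        return super().__new__(cls)

class Shape(RuntimeAbstract):
    @abstractmethod
    def area(self): ...

class Square(Shape):
    def __init__(self, side):
        self.side = side

    def area(self):
        return self.side ** 2
```

With this, `Shape()` raises TypeError at runtime while `Square(3)` works; a static checker would flag the `Shape()` call either way.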
I’m guessing from the lack of responses the answer here was “yes this is horrible”
To be honest, I’m not sure if this is a good idea or not. There are definitely some things that could be done in line with this which could improve the ability to use a composite metaclass derived from multiple existing metaclasses, but I’d need to give it a lot more thought on whether the extra work under the hood is worth it.
But given that I’m pretty sold on composition over inheritance, and even beyond that, on not making things methods which could be functions, I don’t personally run into issues like this frequently. When I have, I’ve just taken the time to reconcile the metaclasses and dealt with it, so this hasn’t felt like something that needed to be made more complex or robust. If this is a more common pain point than I’ve experienced, it could very well be worth at least exploring.
I was trying to propose a mechanism that would stop us needing ABCMeta at all, rather than giving a way to compose metaclasses more easily.
It’s not clear that ABCMeta does anything that an ordinary class’s __init_subclass__ couldn’t do. For example,
```python
from abc import _abc_init

class NewABC:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        _abc_init(cls)
```
seems to work, at least according to this exhaustive test case
```python
from abc import abstractmethod

class MyAbstract(NewABC):
    @abstractmethod
    def foo(self):
        pass

class MyConcrete(MyAbstract):
    def foo(self):
        pass
```
MyAbstract() fails at runtime, as expected, while MyConcrete() succeeds.
Yes, this is what I talked about above, and what this entire topic is about. The only reason ABCMeta exists is to overwrite __instancecheck__/__subclasscheck__ so that virtual subclasses (i.e. ones that don’t inherit from the corresponding ABC) can be supported.
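For reference, that’s the behavior in question, using nothing but the existing stdlib API:

```python
from abc import ABC

class Serializer(ABC):
    pass

class JSONThing:  # note: does not inherit from Serializer
    pass

# register() makes JSONThing a *virtual* subclass; ABCMeta's
# __subclasscheck__/__instancecheck__ consult this registry.
Serializer.register(JSONThing)

assert issubclass(JSONThing, Serializer)
assert isinstance(JSONThing(), Serializer)
```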
The other behavior of preventing instantiation is actually implemented at the C level in object.__new__ when the Py_TPFLAGS_IS_ABSTRACT type flag is set.
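You can see that flag-based check without ABCMeta at all: assigning __abstractmethods__ on a plain class goes through a setter on type that sets the flag, and object.__new__ then refuses to instantiate:

```python
class Plain:  # ordinary class, metaclass is plain `type`
    pass

# `type` has an __abstractmethods__ setter that flips the abstract flag;
# the instantiation check itself lives in object.__new__, not a metaclass.
Plain.__abstractmethods__ = frozenset({"foo"})

try:
    Plain()
except TypeError:
    # "Can't instantiate abstract class Plain ..."
    pass
```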
Yep, that’s what I was suggesting could be moved out of the type system entirely rather than moved into the class structure.
Refer back to my previous answer:
It would be very strange if one class defined in the standard library received special-casing inside isinstance/issubclass. The entire purpose of __instancecheck__/__subclasscheck__ is to avoid polluting those functions with class-specific behavior. The classes themselves should be in charge of that special behavior if they require it; everything else is backwards.
Unless you are suggesting virtual subclassing as a concept should be enshrined into the type API and any class can make use of this feature to register their own virtual subclasses.
Oh, I misunderstood that somewhat. That might be possible then, so long as abc.ABCMeta isn’t removed, so the old way still works for those already using it.
That’s what’s not obvious to me: why do these need to be provided by a metaclass other than type? Was the desire simply to not burden “ordinary” types with the virtual-subclass machinery?
Me? Yes, that is exactly what I’m proposing
I think most people would be surprised if isinstance(obj, cls) didn’t guarantee that obj is an instance of cls. Some special classes that describe something more akin to an interface, like abstract base classes, protocols or TypedDicts, of course do violate this, but normal classes most people create should not be able to be virtually subclassed.
Alright, in that case I misread your statement that this system should be specific to ABC. It’s still a little bit bad that you add additional overhead to every isinstance/issubclass call, but it could be justifiable, if that overhead is small enough.
If the old system could be removed, I think it would be the same naive overhead, roughly – a metaclass method lookup versus a global dict lookup. But there’s so much knowledge I’m missing about Python internals, future JIT optimisations…
You can’t really remove __instancecheck__ and __subclasscheck__; ABCMeta is not the only metaclass that uses them. They’re also used for runtime_checkable Protocols, which use a completely different implementation, because that’s what’s necessary.
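For example, a runtime_checkable Protocol’s __instancecheck__ does a structural check (does the object have the right attributes?) rather than a registry lookup:

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class Closeable(Protocol):
    def close(self) -> None: ...

class File:  # no inheritance, no register() call
    def close(self) -> None:
        pass

# Protocol's __instancecheck__ inspects the object's attributes,
# a completely different strategy from ABCMeta's registry.
assert isinstance(File(), Closeable)
assert not isinstance(object(), Closeable)
```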
I’m talking about additional overhead for all classes, not just ABC, because you’re adding additional code to isinstance/issubclass, not about the fact that the logic may be duplicated between ABCMeta and this new way to handle virtual subclasses.
Pretty sure you could migrate both actually. The differing implementations are appropriately registered at subclass creation time.
I know
That’s why I was talking about “if the old system could be removed”: looking for those methods in the metaclass is also an overhead for all classes, not just ABC and Protocol.
The point is that any metaclass is supposed to be able to use these, not just ABC and Protocol; this is not an internal feature designed purely for Python’s standard library, so you can’t just get rid of it. Also, the more special casing you add to isinstance/issubclass, the worse the overhead gets. Before, you just had dynamic dispatch; now you also need to look at the class and determine which special rules to follow, in addition to the normal subtyping rules, rather than leaving that up to the metaclass, which has all the relevant information.
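And that hook is indeed open to any metaclass today, not just the stdlib ones:

```python
class EvenMeta(type):
    # Any metaclass can override __instancecheck__, not just ABCMeta.
    def __instancecheck__(cls, obj):
        return isinstance(obj, int) and obj % 2 == 0

class Even(metaclass=EvenMeta):
    pass

assert isinstance(4, Even)
assert not isinstance(7, Even)
```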
Understood. It was a big if. You’d have to slowly deprecate the old option, or figure out some clever way to copy the metaclass method information into the new dict at class construction time (and maybe that’s not possible because of dynamism?).
I think that was more a problem with the other suggested solution of adding these methods to the class as well as the metaclass. My suggestion adds a single dictionary lookup; any smarts need to happen in the logic that registers classes in that dictionary.
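To make the suggestion less vague, here’s a pure-Python sketch of the dispatch I mean. `register_instance_check`, the module-level dict, and `my_isinstance` are all invented names; a real version would live inside the interpreter’s isinstance:

```python
# Hypothetical sketch of the proposed dispatch, not real CPython code.
_instance_checks = {}

def register_instance_check(cls, checker):
    """Classes needing special isinstance behavior opt in here,
    at class construction/registration time."""
    _instance_checks[cls] = checker

def my_isinstance(obj, cls):
    # One global dict lookup on every call...
    checker = _instance_checks.get(cls)
    if checker is not None:
        return checker(obj)
    # ...otherwise fall through to the ordinary check.
    return type.__instancecheck__(cls, obj)

class Number:
    pass

register_instance_check(Number, lambda obj: isinstance(obj, (int, float)))
```

Here `my_isinstance(3.5, Number)` is True even though float doesn’t inherit from Number, while unregistered classes take the normal path unchanged.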
I think the main problem here is that I’m not providing any actual code, so stuff is getting lost in translation. I could try to code something up, but I’m not seeing a huge consensus that the status quo needs a fix, so probably that’s the thing to focus on. Thanks for engaging with my vague attempt to describe an algorithm