Make abc.ABC a regular class by making __instancecheck__ and __subclasscheck__ class methods


As powerful as metaclasses are in customizing class behaviors, they are also inherently prone to conflicts with other metaclasses, as pointed out in PEP-487 in its rationale for adding the __init_subclass__ hook:

Metaclasses are a powerful tool to customize class creation. They have, however, the problem that there is no automatic way to combine metaclasses. If one wants to use two metaclasses for a class, a new metaclass combining those two needs to be created, typically manually.


One of the big issues that makes library authors reluctant to use metaclasses (even when they would be appropriate) is the risk of metaclass conflicts. These occur whenever two unrelated metaclasses are used by the desired parents of a class definition. This risk also makes it very difficult to add a metaclass to a class that has previously been published without one.

Currently the biggest use of a metaclass in the standard library is by far abstract classes, via abc.ABC and its derivatives. And yet, time and again one follows good OOD principles and makes a main class inherit from one or more of those abstract classes to signify certain characteristics of the main class, only to run into an unexpected metaclass conflict and have to resort to an ugly manual metaclass merge, if a merge can be done at all (it cannot if both metaclasses contain methods that don't cooperate with others by calling super()).
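To make the conflict and the manual merge concrete, here is a minimal, self-contained sketch; ThirdPartyMeta and the class names are hypothetical stand-ins for a library metaclass such as Qt's:

```python
import abc

# Hypothetical stand-in for a third-party library's metaclass.
class ThirdPartyMeta(type):
    pass

class ThirdPartyBase(metaclass=ThirdPartyMeta):
    pass

# Naive mixing fails:
#   class Widget(ThirdPartyBase, abc.ABC): ...
# raises "TypeError: metaclass conflict: the metaclass of a derived class
# must be a (non-strict) subclass of the metaclasses of all its bases"

# The manual merge: a new metaclass deriving from both.
class MergedMeta(abc.ABCMeta, ThirdPartyMeta):
    pass

class Widget(ThirdPartyBase, abc.ABC, metaclass=MergedMeta):
    @abc.abstractmethod
    def paint(self):
        ...

class Concrete(Widget):
    def paint(self):
        return "painted"

print(Concrete().paint())  # painted
```

This only works because neither metaclass here does anything non-cooperative; with real metaclasses that override __new__ or __call__ without calling super(), the merge can break.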

In the following StackOverflow question, the OP can be seen wondering why a class inheriting from configparser.ConfigParser and PyQt5.QtGui.QStandardItem causes a metaclass conflict error:

from PyQt5.QtGui import QStandardItem
from configparser import ConfigParser

class FinalClass(ConfigParser, QStandardItem):
    ...  # TypeError: metaclass conflict: the metaclass of a derived class
         # must be a (non-strict) subclass of the metaclasses of all its bases

As it turns out, this is because ConfigParser inherits from collections.abc.MutableMapping, an abstract base class, making its metaclass abc.ABCMeta rather than plain type. That differs from the metaclass of PyQt5.QtGui.QStandardItem, whose developers chose a metaclass to implement proxy class behavior.
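One can verify the ConfigParser side of this in a plain interpreter, with no Qt installed:

```python
import abc
from collections.abc import MutableMapping
from configparser import ConfigParser

# ConfigParser inherits from MutableMapping, an abstract base class,
# so its metaclass is abc.ABCMeta rather than plain type.
print(issubclass(ConfigParser, MutableMapping))  # True
print(type(ConfigParser) is abc.ABCMeta)         # True
```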

This is really unfortunate since the risk of a metaclass conflict effectively discourages developers from using abstract classes even when it makes total sense to do so, affecting both code reusability and readability in the long run.

Obviously this is an issue already widely acknowledged, as mentioned in PEP-487, whose introduction of __init_subclass__ and __set_name__ was a giant leap toward eliminating unnecessary uses of metaclasses.

But even with PEP-487, abc.ABC remains the elephant in the room with its metaclass abc.ABCMeta. Reading the code of abc.ABCMeta, it is apparent that the only reason this metaclass exists is to provide __instancecheck__ and __subclasscheck__ methods in order to support isinstance and issubclass calls with user-registered virtual subclasses, and per the documentation, the two dunder methods are looked up on the metaclass only:

Note that these methods are looked up on the type (metaclass) of a class. They cannot be defined as class methods in the actual class. This is consistent with the lookup of special methods that are called on instances, only in this case the instance is itself a class.

So the solution now becomes clear…

The Proposal

Make __instancecheck__ and __subclasscheck__ actual class methods, such that when isinstance and issubclass are called, these methods are looked up directly on the class object of the second argument itself (in addition to its type, for backward compatibility).
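A rough sketch of the proposed lookup, expressed in pure Python rather than as an actual CPython patch (isinstance_proposed is a hypothetical name, and the sketch assumes the class-level hook is written as a classmethod):

```python
def isinstance_proposed(obj, cls):
    # Current behavior: __instancecheck__ is looked up on type(cls) only.
    meta_hook = getattr(type(cls), "__instancecheck__", None)
    if meta_hook is not None and meta_hook is not type.__instancecheck__:
        return meta_hook(cls, obj)
    # Proposed addition: also consult a classmethod defined on cls itself.
    for base in cls.__mro__:
        hook = base.__dict__.get("__instancecheck__")
        if hook is not None:
            return hook.__get__(None, cls)(obj)
    # Fall back to the default check.
    return isinstance(obj, cls)

class Duck:
    @classmethod
    def __instancecheck__(cls, obj):
        return hasattr(obj, "quack")

class Mallard:
    def quack(self):
        return "quack"

print(isinstance_proposed(Mallard(), Duck))  # True
print(isinstance_proposed(42, Duck))         # False
```

The real implementation would of course live in the interpreter's isinstance/issubclass machinery; this only illustrates the intended lookup order.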

Existing metaclass-powered classes in the standard library, such as abc.ABC and typing.Protocol, should be refactored as regular classes.

Note that back in CPython 3.11, there was a comment in typing._ProtocolMeta that states:

This metaclass is really unfortunate and exists only because of the lack of __instancehook__.

But since CPython 3.12, the wording of that comment has changed to state:

This metaclass is somewhat unfortunate, but is necessary for several reasons…

along with the addition of an overriding __new__ method, which, as far as I can tell, is all about fixing an issue with determining when to raise an exception when mixing a protocol with a non-protocol class, and could've been done entirely with __init_subclass__ instead.

Backward Compatibility

There should be no backward compatibility issues, since __instancecheck__ and __subclasscheck__ currently have no effect when defined in a regular class, so there should be no existing code that defines these dunders on a regular class and expects them to take effect.
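The claim that these dunders are currently inert on a regular class is easy to demonstrate (class names here are made up for illustration):

```python
class Sneaky:
    # Defined on the class itself, not on its metaclass, so today
    # isinstance() never consults it.
    def __instancecheck__(self, other):
        return True

print(isinstance(42, Sneaky))  # False: the hook above is ignored

class SneakyMeta(type):
    def __instancecheck__(cls, other):
        return True

class SneakyViaMeta(metaclass=SneakyMeta):
    pass

print(isinstance(42, SneakyViaMeta))  # True: the metaclass hook is honored
```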

Performance Impact

Since an additional attribute lookup on the class object needs to be performed to support this proposal, there will be a small performance impact to calls to isinstance and issubclass. But the huge gains in code reusability and readability should be worth the tradeoff.


You’re failing to take into account code that uses ABCMeta directly, so there most definitely is a backwards compatibility issue. Even if you turn ABCMeta into an alias for type, you’ve now destroyed the ability to distinguish between type and ABCMeta; and if you add a dummy subclass of type to remedy that, you’re right back where you started.

You could argue that this isn’t a problem, but there’s no real way to know without auditing all existing Python code, which can’t be done.


Good point. We can certainly retain abc.ABCMeta for backward compatibility, while implementing it as such:

class ABCMeta(ABC, type):
    ...

I don’t see how you could use a regular class as the base for a metaclass. The only way I can see this working is to keep the original ABCMeta around, deprecate it with the usual deprecation period, and only start using ABC without ABCMeta in the standard library once the deprecation period is over. And even then you could make the argument that there’s still potential for breakage in third-party code that uses ABC.

I think there is definitely some benefit to reducing the need for metaclasses, but it will be much harder to justify removing existing metaclasses and/or changing the metaclasses for existing classes in the standard library.


I don’t see why not. ABC is just a mixin class in this case, while ABCMeta still inherits from type.

Quick example:

class ABC:
    def __instancecheck__(cls, other):
        return True

class ABCMeta(ABC, type):
    ...

class Bar(metaclass=ABCMeta):
    ...

print(Bar.__instancecheck__(1)) # outputs True

Have you actually run this code? The only thing that prints foo is creating ABCMeta, because __init_subclass__ applies to subclasses of ABC, not to classes using it as a metaclass. Bar is not a subclass of ABC.

Oops. Fixed with an actual class method then. Thanks.

Won’t this change make any existing user metaclass that defines these methods start reacting differently to isinstance checks? Things will start appearing like they are subclasses of the metaclass itself?

I think perhaps these need a different name if defined directly on a class rather than via the metaclass.

(I’m also curious how multiple inheritance would work?)


Very good point. Previously you would’ve gotten a metaclass conflict error and been forced to think about whether the two metaclasses can even be combined in a compatible way without overriding some of the logic in the combined metaclass. Now you would get no error at all and end up with a difficult-to-diagnose bug where the class behaves like one base class or the other, depending on the inheritance order.

For that very reason, it’s probably a good idea to keep using metaclasses for classes that behave differently from regular classes, and thus don’t compose easily with other classes that also behave differently in a potentially incompatible way, even if the same behavior could be implemented using a regular class.

I don’t think it makes sense for these behaviours to inherit in the general case. I wonder if perhaps moving them outside of the type system completely might be a viable option?

  • Add a registration system that takes a type and an isinstance/issubclass implementation
  • Store these in a weak-key map (or something)
  • Change ABC to register its implementation when a direct subclass is created (i.e. in __init_subclass__)
  • If this implementation is wrong, it can simply be replaced by the class itself
  • The global isinstance/issubclass implementation can fall back to checking the metaclass if a registration hasn’t been performed, for backwards compatibility.
  • ABC can also check the metaclass to see if it contains a conflicting definition of the legacy methods

Is this obviously broken? I’m pretty sure there’s some migration issues I’ve missed but is the idea itself broken out of the gate?
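The registry idea in the bullets above could be sketched roughly like this; all names here (register_instancecheck, isinstance_v2, and the example classes) are hypothetical:

```python
import weakref

# Hypothetical registry mapping a class to its instance-check implementation.
# A WeakKeyDictionary so registrations die with the class.
_instance_hooks = weakref.WeakKeyDictionary()

def register_instancecheck(cls, hook):
    _instance_hooks[cls] = hook

def isinstance_v2(obj, cls):
    hook = _instance_hooks.get(cls)
    if hook is not None:
        return hook(cls, obj)
    # Fall back to the normal metaclass-driven protocol.
    return isinstance(obj, cls)

class Quacker:
    pass

# Structural check registered outside the type system.
register_instancecheck(Quacker, lambda cls, obj: hasattr(obj, "quack"))

class Duck:
    def quack(self):
        return "quack"

print(isinstance_v2(Duck(), Quacker))  # True
print(isinstance_v2(42, Quacker))      # False
```

This sidesteps inheritance of the hooks entirely, which matches the earlier point that these behaviours probably shouldn't inherit in the general case.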

This just smells like a God Class, or at the very least is a very good example of when to favour composition over inheritance.

Otherwise, why does the GUI framework need to know implementation details about parsing .ini files, or why does the custom config parser need to know implementation details about the GUI framework it lives in, details so important that this special parser can then only be used with PyQt5?

What’s the justification for breaking Separation of Concerns, over parsing .ini files of all things, to have a thing that is both a) “a standard Qt base item thingy” and b) “a thing that parses .ini files”? Instead, have a different thing that owns one of each of those two things, in which the config parser could easily be swapped out (for a JSON parser or a TOML parser, say), so that anything special about the config parser can also be used in any other GUI framework.


Much more than just the __new__ method was added to _ProtocolMeta in Python 3.12 relative to Python 3.11. The addition of the __new__ method to _ProtocolMeta specifically, however, had nothing to do with the issue you linked to; it was added in a commit linked to python/cpython Issue #105280 (“`typing.Protocol` implementation means that `isinstance([], …)` can sometimes evaluate to `True`”). I think you’ll find that the tests added in that commit will fail if you rip out _ProtocolMeta.__new__ and put that logic in __init_subclass__.


Indeed that was such a terrible pick on my part out of the tens of related questions on SO.

A much better example can be seen here, where all that the user wants to do is to define an abstract method for a base Qt widget class, but runs into a metaclass conflict and has to solve it in a counter-intuitive way:

from abc import ABC, abstractmethod

from PySide6.QtWidgets import QApplication, QWidget

class MyBaseWidget(QWidget, ABC):  # TypeError: metaclass conflict

    def __init__(self, parent=None):
        ...

    @abstractmethod
    def foo(self):
        ...
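The counter-intuitive solution referred to above is the manual metaclass merge. A sketch of it below, using a hypothetical stand-in metaclass in place of PySide6's (the real one would be obtained with type(QWidget), but the Qt binding may not be importable here):

```python
from abc import ABC, ABCMeta, abstractmethod

# Hypothetical stand-in for the Qt binding's metaclass.
class QtStandInMeta(type):
    pass

# Hypothetical stand-in for QWidget.
class QWidgetStandIn(metaclass=QtStandInMeta):
    def __init__(self, parent=None):
        self.parent = parent

# The merge the user is forced to write:
class QABCMeta(QtStandInMeta, ABCMeta):
    pass

class MyBaseWidget(QWidgetStandIn, ABC, metaclass=QABCMeta):
    @abstractmethod
    def foo(self): ...

class ConcreteWidget(MyBaseWidget):
    def foo(self):
        return "ok"

print(ConcreteWidget().foo())  # ok
```

With the real binding, `QtStandInMeta` would be `type(QWidget)`, and the rest is the same; the point is that the user has to know about metaclasses at all just to add one abstract method.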

Also a good example here, where all that the user wants to do is to define an abstract class for an Enum class, but runs into a metaclass conflict:

import abc
from enum import Enum

class MyFirstClass(abc.ABC):
    @abc.abstractmethod
    def func(self):
        ...

class MySecondClass(Enum, MyFirstClass):  # TypeError: metaclass conflict
    VALUE_1 = 0
    VALUE_2 = 1
    def func(self):
        return 42
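For reference, the workaround that conflict forces today is again a manual metaclass merge (and note the mixin has to come before Enum, per Enum's subclassing rules):

```python
import abc
from enum import Enum, EnumMeta

# Manual merge needed today to combine ABC with Enum.
class ABCEnumMeta(abc.ABCMeta, EnumMeta):
    pass

class MyFirstClass(abc.ABC):
    @abc.abstractmethod
    def func(self):
        ...

class MySecondClass(MyFirstClass, Enum, metaclass=ABCEnumMeta):
    VALUE_1 = 0
    VALUE_2 = 1
    def func(self):
        return 42

print(MySecondClass.VALUE_1.func())  # 42
```

This happens to work because both metaclasses cooperate via super(); under the proposal, no merge would be needed at all.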

The main point of this proposal is to free abc.ABC (perfect for mixins) from the limitations of a metaclass so users can mix it with classes that do have justifiable reasons to use metaclasses (such as Enum) and produce more intuitive class hierarchies that fit their needs.

Sure. But again, the whole point of Enums is that they’re unusual. FYI if you read the documentation, subclassing Enums has special requirements.

It’s a super useful library, but just because PyQt has overused metaclasses for whatever reason (or especially if it never intended to support multiple inheritance), that does not necessitate changes to the core Python language IMHO.

Your arguments do make an excellent point. I just think the conclusion should be to avoid using metaclasses unless absolutely necessary.

Isn’t that the point – that ABC shouldn’t be one of those “absolutely necessary” cases?


Thanks. Beat me to it. The point of this proposal is to reduce the circumstances where using a metaclass is absolutely necessary. Since most people use abc.ABC for its abstract methods, I feel that it doesn’t justify its use of a metaclass just so that it can support virtual subclasses, hence the tradeoff to make __instancecheck__ and __subclasscheck__ class methods.

If the primary reason for ABC to have a metaclass is to allow having a registry of virtual subclasses, wouldn’t it be an option to deprecate this and have a separate ABCWithVirtual class that keeps the current behavior (and metaclass)? I think that most users of ABC are perfectly happy without having the ability to register virtual subclasses. This would remove the need for the two magic methods.
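For what it's worth, abstract-method enforcement alone really doesn't need a metaclass. A sketch under that assumption, using only __init_subclass__ and __new__ (SimpleABC and __abstract_names__ are made-up names, and this skips virtual subclasses entirely):

```python
from abc import abstractmethod

class SimpleABC:
    __abstract_names__ = frozenset()

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        abstract = set()
        # Collect every name whose current resolution on cls is still
        # marked abstract anywhere in the MRO.
        for base in cls.__mro__:
            for name in vars(base):
                value = getattr(cls, name, None)
                if getattr(value, "__isabstractmethod__", False):
                    abstract.add(name)
        cls.__abstract_names__ = frozenset(abstract)

    def __new__(cls, *args, **kwargs):
        if cls.__abstract_names__:
            raise TypeError(
                f"Can't instantiate abstract class {cls.__name__} "
                f"with abstract methods {', '.join(sorted(cls.__abstract_names__))}"
            )
        return super().__new__(cls)

class Shape(SimpleABC):
    @abstractmethod
    def area(self): ...

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):
        return self.side ** 2

print(Square(3).area())  # 9
# Shape() raises TypeError, just like an ABCMeta-based abstract class.
```

An ABCWithVirtual split along these lines would leave only the virtual-subclass registry needing the metaclass.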

That would be great, but it would also create backward compatibility issues, although I’m not sure if it’s easy to gauge how much code out there would be actually affected by deprecating virtual subclass registration from abc.ABC.

I don’t think the implementation of isinstance/issubclass should care about the existence of specific special classes, even if those happen to be part of the standard library. If it can’t be done with __instancecheck__/__subclasscheck__ in a predictable manner, then it shouldn’t be done that way.

So we’re back to the dangers of inheriting from classes that define conflicting __instancecheck__/__subclasscheck__ methods; at least with metaclasses you have to first mix the metaclasses before things can go wrong.


That’s just the status quo for all mixin classes today. Typically one would pick mixin classes that define methods that other base classes don’t define or are fine being overridden.

At least without a metaclass one can define abstract methods for mixin classes more freely; the chance that a mixin class also defines __instancecheck__/__subclasscheck__ methods is much lower than the chance that another base class uses a metaclass.