Inheriting from an extension type and built-in types

I have an extension metatype which handles serialization/deserialization. I instantiate an object of this metatype (which is itself a type) called Serializable, and use that as a base class of my serializable objects.

One of my primitives is a bignum type, which derives from both Serializable and int.

The Python code looks like this:

    import _extension

    Serializable = _extension.Metatype('Serializable', (), {})

    class Bignum(Serializable, int):
        def __str__(self):
            return '0x%x' % self

Lovely. The problem is that if I call to_bytes from the C extension, PyObject_CallMethod complains that “Bignum has no attribute ‘to_bytes’”. This is very strange, because if I instantiate a Bignum object in ptpython, I can call to_bytes without any problem. I can also call getattr(bn, 'to_bytes') and then invoke the resulting callable to retrieve the bytes.
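Roughly, the call from the C side looks like this (a minimal sketch; the length and byteorder arguments are placeholders, not what my real code passes):

    /* bn is a Bignum instance handed to the extension from Python.
     * This call fails with the AttributeError quoted above, even though
     * the method is reachable via getattr() from Python code. */
    PyObject *result = PyObject_CallMethod(bn, "to_bytes", "(is)", 8, "little");
    if (result == NULL) {
        return NULL;
    }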

Even weirder: If I add to_bytes = int.to_bytes to the definition of Bignum, everything works!
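That is, a Bignum defined like this works fine when called through PyObject_CallMethod (a sketch of the workaround, not something I want to keep):

    class Bignum(Serializable, int):
        # Re-stating the inherited method in the class body is enough
        # to make the C-side lookup succeed.
        to_bytes = int.to_bytes

        def __str__(self):
            return '0x%x' % self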

Note that I am writing to the Stable ABI, which is why I want to call to_bytes rather than just calling _PyLong_AsByteArray.

The definition of the metatype is:

    /* The metatype's only base is type itself. */
    PyType_Slot slots[] = {
      {.slot = Py_tp_base, .pfunc = &PyType_Type},
      {0, NULL}
    };
    PyType_Spec spec = {
      .name = "Metatype",
      /* type_header_offset approximates type's tp_basicsize; see the note below. */
      .basicsize = (int)(type_header_offset + sizeof(struct foo*)),
      .flags = Py_TPFLAGS_DEFAULT,
      .slots = slots,
    };
    return PyType_FromSpec(&spec);

Any idea what I am doing wrong?
(type_header_offset is basically PyType_Type's tp_basicsize, but because this is using the stable ABI I can't read that field directly, so I am currently using an egregious hack. I already have a ticket to read type.__basicsize__ instead.)
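For reference, the plan for replacing that hack is just to read the attribute through the limited API, along these lines (a sketch with minimal error handling):

    /* Read type.__basicsize__ via the limited API instead of touching
     * PyTypeObject fields directly. */
    PyObject *attr = PyObject_GetAttrString((PyObject *)&PyType_Type, "__basicsize__");
    if (attr == NULL) {
        return NULL;
    }
    Py_ssize_t type_header_offset = PyLong_AsSsize_t(attr);
    Py_DECREF(attr);
    if (type_header_offset == -1 && PyErr_Occurred()) {
        return NULL;
    }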

Other relevant information: I hit the problem running on Linux using a locally built Python 3.10.

The extension needs to work on Python 3.7+, and on 64-bit Windows and Linux (obviously, with different builds for Linux and Windows).