Inherited config dictionary in a class hierarchy

Hello everybody,

I am having some trouble related to a class hierarchy in Python.
In this hierarchy, each class holds a dictionary with config parameters.
My goal is to define a method get_config such that an instance of any of these classes returns a merged dictionary containing the config parameters of the calling instance plus all the parameters defined in the super classes.

I hope the following code makes my problem clear:

class A:
    _config = dict(a=1, b=2, c=3)

    def get_config(self):
        """TODO"""


class B(A):
    _config = dict(a=3, d=7)


a = A()
b = B()


assert a.get_config() == dict(a=1, b=2, c=3)
assert b.get_config() == dict(a=3, b=2, c=3, d=7)

I currently have the following solution:

    def get_config(self, cls=None):
        if cls is None:
            cls = type(self)
        next_super_cls = cls.mro()[1]

        if next_super_cls is object:
            return cls._config

        # recursively collect the configs of all super classes
        super_config = self.get_config(cls=next_super_cls)

        return {**super_config, **cls._config}

… but I wonder if there is a more straightforward way to do the same.

I am looking forward to your suggestions!

I did not look at your implementation yet, but I do not understand why you want to store the configuration in class variables.

Classes are normally used to create objects, and data are normally stored in object attributes. For the hierarchy you would not need to create a new class for every hierarchy level and every new part of the configuration, and you would not need inheritance at all. You would just link the configuration objects together to form the hierarchy.
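A minimal sketch of what I mean by linking configuration objects (the class and attribute names here are just for illustration):

```python
class Config:
    def __init__(self, values, parent=None):
        self.values = values
        self.parent = parent  # link to the more generic config, if any

    def get(self):
        # parent values first, own values override them
        merged = self.parent.get() if self.parent else {}
        return {**merged, **self.values}


base = Config(dict(a=1, b=2, c=3))
child = Config(dict(a=3, d=7), parent=base)

assert child.get() == dict(a=3, b=2, c=3, d=7)
```

No subclassing is involved; adding a new level is just creating another `Config` with a `parent`.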

Inheritance complicates things and is usually best avoided.


The standard library contains a way to chain dictionaries, which could be exactly what you need: collections.ChainMap

from collections import ChainMap

config_A = dict(a=1, b=2, c=3)
config_B = dict(a=3, d=7)
config = ChainMap(config_B, config_A)

assert config == {'a': 3, 'b': 2, 'c': 3, 'd': 7}

print(config)
ChainMap({'a': 3, 'd': 7}, {'a': 1, 'b': 2, 'c': 3})
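If more levels appear later, you do not have to rebuild the chain: `new_child` pushes a more specific mapping onto the front. A quick sketch (with a hypothetical third level `config_C`):

```python
from collections import ChainMap

config_A = dict(a=1, b=2, c=3)
config_B = dict(a=3, d=7)
config_C = dict(a=5, f=9)

# start from the most generic config and push more specific ones on top
config = ChainMap(config_A).new_child(config_B).new_child(config_C)

assert dict(config) == {'a': 5, 'b': 2, 'c': 3, 'd': 7, 'f': 9}
```

Lookups search the maps front to back, so the most specific value wins, and updates to the underlying dicts are visible through the chain.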

Thanks for your feedback!
Especially the ChainMap looks very helpful.
I guess my code example does not make it clear why I want to do this specific thing. The reason is that I have an existing hierarchy of pydantic classes (they are a bit similar to dataclasses, but also try to cast their attributes to the specified types), and I want to enhance their functionality without requiring someone else who might extend the hierarchy further to write extra boilerplate code. That’s why I try to keep all the logic in the base class A.

This might be a better example:

from pydantic import BaseModel


class A(BaseModel):
    _config = dict(a=1, b=2, c=3)

    x: int = 1
    y: int = 2


class B(A):
    _config = dict(a=3, d=7)

    z: int = 3


class C(A):
    _config = {...}

    a: str = ""


class D(C):
    _config = {...}

    b: float = 1.2

The config of each class is supposed to be the same for all instances of that class, so I think it is OK in this specific case to leave it as a class variable.

I agree, but unfortunately I cannot control if someone after me might decide to subclass one of the classes A, B, C, D, …

In any case, thank you for your input. I will see if I can find a better solution tomorrow.

Hi gkb,
I suggest this plain code:

class A:
    _config = dict(a=1, b=2, c=3)

    def get_config(self):
        return self._config

class B(A):
    _config = dict(a=3, d=7)
    _config = {**A._config, **_config}

Hello Kazuya,
thanks for your suggestion!
In the case of two classes (and given my constraints), this absolutely makes sense. However, with many more subclasses I would like to avoid repeating the merge of the current dictionary with the dictionaries of the superclasses in every class body. That’s why I tried to implement this logic in A.get_config.
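For context, the recursive get_config from my first post can also be written as a single loop over the MRO, which keeps all the logic in A. A sketch with the plain classes from my original example:

```python
class A:
    _config = dict(a=1, b=2, c=3)

    def get_config(self):
        merged = {}
        # walk the MRO from most generic to most specific,
        # so subclass entries override base-class entries
        for klass in reversed(type(self).__mro__):
            merged.update(getattr(klass, "_config", {}))
        return merged


class B(A):
    _config = dict(a=3, d=7)


assert B().get_config() == dict(a=3, b=2, c=3, d=7)
```

The `getattr(..., {})` default skips classes (like `object`) that define no `_config`.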

I see. I would also try to design it like you do if the number of classes is unknown or exceeds…maybe 10 :slightly_smiling_face: In that case, you can use a metaclass as follows:

class M(type):
    _config = dict(a=1, b=2, c=3)

    def __new__(cls, name, bases, attr):
        attr['_config'] = {**cls._config, **attr['_config']}
        return type.__new__(cls, name, bases, attr)

class C(metaclass=M):
    _config = dict(a=3, d=7)

which gives,

>>> c = C()
>>> c._config
{'a': 3, 'b': 2, 'c': 3, 'd': 7}

Thanks, this is a really nice solution (finally I have a reason to use a metaclass :smile: ).
I modified it a bit so that each class will collect the dictionaries of all of its super classes:

class M(type):
    def __new__(cls, name, bases, attr):
        _config = {}
        for base in bases:
            inherited_conf = getattr(base, "_config", {})
            _config |= inherited_conf
        _config |= attr["_config"]
        attr["_config"] = _config
        return type.__new__(cls, name, bases, attr)


class A(metaclass=M):
    _config = dict(a=1, b=2, c=3)

    @property
    def config(self):
        return self._config


class B(A):
    _config = dict(a=3, d=7)


class C(B):
    _config = dict(a=5, f=9)


def test_config():
    a = A()
    b = B()
    c = C()

    assert a.config == dict(a=1, b=2, c=3)
    assert b.config == dict(a=3, b=2, c=3, d=7)
    assert c.config == dict(a=5, b=2, c=3, d=7, f=9)

One last update: in my actual application, the class A needs to inherit from an additional class (a pydantic BaseModel), so the metaclass approach no longer works (pydantic’s BaseModel brings its own metaclass, and the two conflict).
That’s why I switched to __init_subclass__ instead:

class M:
    def __init_subclass__(cls):
        base_configs = {}
        for base in cls.__bases__:
            if hasattr(base, "_config"):
                base_configs |= base._config

        cls._config = base_configs | cls._config
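A plain-class sanity check of the same idea (I renamed `M` to `ConfigBase` here and used `cls.__dict__` so a subclass that defines no `_config` of its own is handled too):

```python
class ConfigBase:
    _config: dict = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        merged = {}
        for base in cls.__bases__:
            if hasattr(base, "_config"):
                merged |= base._config
        # cls.__dict__ picks up only a _config defined on the class itself
        merged |= cls.__dict__.get("_config", {})
        cls._config = merged


class A(ConfigBase):
    _config = dict(a=1, b=2, c=3)


class B(A):
    _config = dict(a=3, d=7)


assert B._config == dict(a=3, b=2, c=3, d=7)
```

Since `__init_subclass__` runs on class creation, each class ends up with its fully merged `_config`, no metaclass required.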

I’m new to Python, but happened to be working with pydantic quite a bit.

# models.py

class AppConfigEnvModel(BaseSettings):
    APPCONFIG_APP_NAME: str = Field(..., env="APPCONFIG_APP_NAME")
    APPCONFIG_CONFIG_PROFILE: str = Field(..., env="APPCONFIG_CONFIG_PROFILE")

    class Config:
        extra = "ignore"


class AppConfigModel(BaseModel):
    _SM: dict[str, str]
    API_ACTIONS: AppConfigApiActionsModel
    BRANDS: AppConfigBrandsModel
    ENV: AppConfigEnvModel
    SETTINGS: AppConfigSettingsModel

    class Config:
        extra = "forbid"


# config.py
# just a basic outline of what's going on here

      ac = parameters.get_app_config(
          name=profile,
          environment=environment,
          application=application,
          transform=transform,
          force_fetch=force_fetch,
          max_age=max_age,
          **sdk_options,
      )
      ac = yaml.load(ac, Loader=yaml.Loader)

...


    ac = parameters.get_app_config(
        name=profile,
        environment=environment,
        application=application,
        transform="json",
        force_fetch=force_fetch,
        max_age=max_age,
        **sdk_options,
    )

....

acm: AppConfigModel = AppConfigModel(
    _SM={**secrets_provider.get(), **secrets_provider.get()},
    API_ACTIONS=ac.get("API_ACTIONS", ...),
    BRANDS=ac.get("BRANDS",...),
    SETTINGS=ac.get("SETTINGS", ...),
)

Not quite sure if I’m doing this right, but this basically loads a bunch of configuration (env, AWS SSM/Secrets Manager, and AWS AppConfig) into a model that can be reused anywhere within the application.

The configuration is huge, hundreds of lines of JSON/YAML, and what I like about using pydantic is that it provides excellent type hinting, so you can easily find the values you need while coding, e.g.

SOME_CONST = get_config().SETTINGS.BLAH.BLAH

The one thing I’m not entirely sure about is whether these models are supposed to be initialized in the base models file or in a helper config library like the above (say appconfig.py).

I noticed ENV is loaded all by itself, so I don’t bother doing it here and left it to the BaseSettings model.

Any thoughts/suggestions?

Thank you.