Design by Contract in Python: proposal for native class invariants

Hi all,

I’d like to start a discussion around introducing Design by Contract (DbC) into Python, specifically through native support for class invariants.

What I’m proposing

The idea is to allow classes to define an __invariant__() method that is automatically checked:

  • Before and after every public method call
  • Without needing decorators, metaclasses, or manual calls
  • Ideally in a way that’s opt-in to avoid runtime overhead for all classes

This would make it possible to build self-verifying classes that enforce consistency at runtime—especially useful in stateful systems, simulations, or anywhere where correctness matters.

Why this matters

Python provides flexible tools for enforcing invariants, but it lacks a built-in, standardized mechanism for ensuring object consistency by construction—such as automatic validation of class invariants before and after method execution. Class invariants are a common concept in formal methods and have been used successfully in languages like Eiffel, D, Ada, and even early .NET experiments with Code Contracts (and Spec#).

Today in Python, trying to enforce invariants is awkward:

  • You have to call your invariant method manually inside every method.
  • Or use fragile tricks like setattr, decorators, or metaclasses.
  • These approaches are verbose, error-prone, and often slow.
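For concreteness, here is a sketch of the manual style the first bullet describes (class and helper names are illustrative, not from any existing library):

```python
class Account:
    """Invariant checks must be repeated by hand in every method."""

    def __init__(self):
        self.balance = 0
        self._check_invariant()

    def _check_invariant(self):
        assert self.balance >= 0, "Balance must never be negative"

    def deposit(self, amount):
        self._check_invariant()   # easy to forget...
        self.balance += amount
        self._check_invariant()   # ...and easy to forget again

    def withdraw(self, amount):
        self._check_invariant()
        self.balance -= amount
        self._check_invariant()
```

Every method carries two boilerplate calls, and nothing stops a new method from silently omitting them.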

A built-in mechanism would allow consistent, clean enforcement of class-level contracts in a Pythonic way.

Possible syntax / semantics

This could work like:

class Account:
    __dbc__ = True  # Opt-in to DbC

    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        self.balance -= amount

    def __invariant__(self):
        assert self.balance >= 0, "Balance must never be negative"

When enabled, Python would automatically call __invariant__() before and after any public method.

Alternatives:

  • Use a class decorator like @contract
  • Only enforce checks in debug mode (i.e., when __debug__ is true)
  • Let users enable it per-class or globally
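As a sketch of the decorator alternative, here is one way a hypothetical @contract could behave; the name, the __debug__ gating, and the "public method" heuristic are all assumptions for illustration, not an existing API:

```python
import functools

def contract(cls):
    """Hypothetical @contract decorator: wraps public methods so that
    __invariant__ runs before and after each call, only when __debug__."""
    if not __debug__ or "__invariant__" not in vars(cls):
        return cls  # no-op under -O, or when no invariant is defined
    for name, attr in list(vars(cls).items()):
        if callable(attr) and not name.startswith("_"):
            def make_wrapper(method):
                @functools.wraps(method)
                def wrapper(self, *args, **kwargs):
                    self.__invariant__()          # check before
                    result = method(self, *args, **kwargs)
                    self.__invariant__()          # check after
                    return result
                return wrapper
            setattr(cls, name, make_wrapper(attr))
    return cls

@contract
class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        self.balance -= amount

    def __invariant__(self):
        assert self.balance >= 0, "Balance must never be negative"
```

The factory function (make_wrapper) is needed so each wrapper closes over its own method rather than the loop variable.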

Implementation status

I attempted to build this as a C extension (Invariant-Python) that hooks into CPython internals (e.g., PyObject_Call). Unfortunately, Python’s object model and dynamic dispatch made this more complex than expected, and I hit a wall. So I fell back on a rather basic pure-Python implementation, Invariant Python, purely to illustrate the idea.

What I’d love feedback on

  • Would this idea make sense in the Python language itself?
  • Are there internal hooks that could make this feasible without major overhead?
  • Any interest in helping prototype it or shaping a possible PEP?

Thanks for reading! Happy to elaborate further or adjust direction based on your input.

— Andrea


Contract programming in D
Contract programming in Eiffel

Could you please clarify how specific your “DbC invariants” proposal is. Are other important parts of DbC, namely pre- and postconditions, also in scope?

Thanks for the question—great point.

Right now, my proposal focuses specifically on class invariants, for a couple of reasons:

  1. Invariants are the least intrusive and most structural part of Design by Contract. They don’t require changing method signatures or wrapping each method individually—they operate at the object level.
  2. I see them as a gateway to broader DbC—if the community is open to the idea of native support for invariants, it could lay the groundwork for exploring pre/postconditions in a future, more comprehensive proposal.

That said, I definitely believe in the value of preconditions (require) and postconditions (ensure). If there’s interest, I’d be happy to explore a roadmap where:

  • Invariants come first
  • Preconditions and postconditions could follow, perhaps with opt-in syntax (@contract, __pre__, __post__, etc.)

I suppose for now, the goal is to keep the scope minimal and adoption friction low. Start simple, prove the value, and build from there.

Some links:


Thanks for sharing both PEP 316 and icontract—they’re definitely part of the bigger picture here.

PEP 316 had some great ideas, but it tried to cover a lot all at once—preconditions, postconditions, invariants, even loop invariants—all via docstrings. I think that scope made it hard to gain traction. What I’m proposing is much narrower and simpler: just native support for class invariants, nothing more (for now).

As for icontract: I really respect the work behind it. It’s probably one of the cleanest decorator-based libraries out there. But at the end of the day, it still relies on decorators and runtime wrapping. That means you have to remember to use it everywhere, and there’s nothing in the language itself to enforce consistency. It works, but it’s fragile.

I see both PEP 316 and icontract as signs that people want this kind of thing—but also that we’ve never had a low-friction, built-in way to do it in Python. That’s the gap I’m hoping to close.

There’s a lot about this that’s appealing. But throwing an assertion error at runtime doesn’t seem that great to me for production code.

Would it add too much complexity to give client code, and users of libraries, a way to turn this off?

Could there be a way to create a draft or prototype instance, test the invariant on that after a method call, and if the invariant fails, handle the failure while leaving the original instance alone?

Would having each method return a new immutable instance work better in all circumstances? (The invariant can then be tested in __init__, or in a __post_init__ method for a dataclass.)
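The immutable variant suggested above could look something like this sketch, assuming a frozen dataclass that validates in __post_init__ (the method names beyond the dataclass hooks are illustrative):

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Account:
    balance: int = 0

    def __post_init__(self):
        # The invariant runs once per instance, at construction time;
        # dataclasses.replace() re-runs it for every "mutation"
        if self.balance < 0:
            raise ValueError("Balance must never be negative")

    def deposit(self, amount):
        # Each operation returns a new, already-validated instance
        return replace(self, balance=self.balance + amount)

    def withdraw(self, amount):
        return replace(self, balance=self.balance - amount)
```

Because replace() goes through __init__, a failing invariant rejects the new instance while the original remains untouched, which is exactly the "leave the original alone" behavior asked about above.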


Thanks for expanding on that.

For production, tying invariants to __debug__ is the cleanest approach—like Python’s built-in assert. It would let you enforce invariants during development, while avoiding runtime assertion errors in production, unless explicitly enabled.
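A minimal sketch of that gating, with an illustrative helper name, could look like this:

```python
def check_invariant(obj):
    """Run the object's invariant only in debug mode, mirroring assert:
    under `python -O`, __debug__ is False and the check does nothing."""
    if __debug__ and hasattr(obj, "__invariant__"):
        obj.__invariant__()

class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount
        check_invariant(self)   # becomes a no-op when run with -O

    def __invariant__(self):
        assert self.balance >= 0, "Balance must never be negative"
```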

The idea of “draft” instances is a creative one, but testing invariants on temporary objects might lead to inconsistencies, especially with mutable state. Class-level invariants offer more reliable enforcement across an object’s lifecycle.

Immutability and invariants are complementary, and in scenarios with mutable state, invariants help ensure consistency without having to check every mutation.

Thanks for the thoughtful input!


I’m not sure why you put metaclasses in the category of fragile tricks; to me, this is the perfect use case for a metaclass.

I do wonder whether there is a real benefit to adding this to the language, as it may slow down class creation a tiny bit even if you gate the method checks behind __debug__ to avoid slowing down the methods themselves. The thing is, the metaclass implementation is fairly straightforward.

Here is a basic example of a naive metaclass implementation of what you’re asking for:

from functools import wraps

def invariant(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        # Could be gated behind `if __debug__:` as desired
        args[0].__invariant__()  # check before the call
        ret = func(*args, **kwargs)
        args[0].__invariant__()  # check after the call
        return ret
    return wrapper

class MetaContract(type):
    def __new__(cls, name, bases, dct):
        cls_obj = super().__new__(cls, name, bases, dct)
        if "__invariant__" in dct:  # classes without __invariant__ cost nothing
            for attr, value in dct.items():
                if callable(value) and not attr.startswith('_'):
                    setattr(cls_obj, attr, invariant(value))
        return cls_obj

class Contract(metaclass=MetaContract):
    ...  # inherit from this so you don't write metaclass= in your final code


class Account(Contract):
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        self.balance -= amount

    def __invariant__(self):
        assert self.balance >= 0, "Balance must never be negative"


# Instantiate the class
account = Account()
account.deposit(4)
account.withdraw(2)

# This call violates the invariant and raises an AssertionError
account.withdraw(3)


Thanks for the clear example - it’s a solid illustration of what’s possible today.

The reason I’m exploring a built-in approach is that wrapping methods like this changes how method calls are dispatched - you’re no longer calling the original method, but a wrapper. That can be fine in isolation, but it becomes harder to reason about when combined with inheritance, ORMs, or other tools that also use metaclasses.

It’s not that this can’t be done - it’s that it’s easy to get surprising behavior, and tooling has no visibility into it. Native support could offer a cleaner, more consistent path for those who want it.

Surely this is better done with properties?

class Account:
    def __init__(self):
        self._balance = 0

    @property
    def balance(self):
        return self._balance

    @balance.setter
    def balance(self, value):
        if value < 0:
            raise ValueError("Balance must never be negative")
        self._balance = value

    def deposit(self, amount):
        self.balance += amount

    def withdraw(self, amount):
        self.balance -= amount

This approach:

  • Only revalidates affected attributes rather than the whole class.
  • Prevents an invalid transaction from putting the object into an invalid state (your Account class would still have a negative balance after raising the exception).
  • Protects against direct attribute access (self.balance = -10).

You’re right—properties are great for validating individual fields like balance, and in many cases, that’s all you need.

Invariants go a step further: they validate object-wide consistency. So if multiple fields interact (e.g., balance and available_credit), or consistency depends on internal state across methods, a single __invariant__ check makes it easier to verify the whole object is still valid.

This isn’t about replacing properties - it’s about having a clean way to validate cross-cutting state after any mutation, without having to sprinkle checks everywhere.
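To make the cross-field case concrete, here is a sketch where balance and available_credit must stay consistent with each other; the field names and credit semantics are assumptions for illustration:

```python
class CreditAccount:
    """An invariant spanning two interacting fields, which per-property
    setters cannot validate together."""

    CREDIT_LIMIT = 100

    def __init__(self):
        self.balance = 0
        self.available_credit = self.CREDIT_LIMIT

    def charge(self, amount):
        self.balance -= amount
        self.available_credit -= amount
        self.__invariant__()  # called manually here; the proposal would automate this

    def __invariant__(self):
        # Cross-field consistency no single property setter can see
        assert self.available_credit == self.CREDIT_LIMIT + self.balance
        assert self.available_credit >= 0, "Credit limit exceeded"
```

A setter on balance alone cannot know whether available_credit was updated in lockstep; the object-level check can.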

Those are fair points, and now I see that you did basically the same thing in your code, except for adding the final Contract class to avoid spelling out the metaclass.

I can’t really see many disadvantages to the metaclass, even when using it with an ORM; if you disagree, it would help to be more explicit there. Even a native implementation of this would open the door to unwanted side effects from calling a method, etc. As for the other possible disadvantages, they depend more on the tooling than on Python itself.

In any case, I would feel lucky in your shoes. The feature you want in the language can be implemented in ~20 lines, meaning you (or anyone) can already use it. For reference, I have several issues with the language that I could not work around as easily.


Thanks for your thoughtful reply, Pablo!

You’re right, the metaclass-based approach works and is relatively simple. My concern is more about its implicit nature: it can lead to unexpected behavior in large systems or with conflicting libraries. A native implementation could make this more explicit, and provide clearer tooling support for developers not familiar with metaclasses.

I completely agree that this could be implemented today with around 20 lines. But the idea is to standardize the approach for developers who need it, while keeping it optional.

Thanks again for the discussion!


No.

It should be an optional add-on, not a built-in that everyone has to pay the cost of.

Just to clarify: this isn’t something that would affect users unless they explicitly opt in by defining __invariant__(), much like __slots__, __setattr__, or __init_subclass__.

The goal isn’t to make it mandatory or universally active, but to provide a consistent and lightweight mechanism for those who want it, without relying on decorators or metaclasses.

The feedback so far has been valuable, and I’m already working on a standalone implementation to explore its practicality and limitations before suggesting anything more formal.


When would the runtime check for the existence of __invariant__()?

I would suggest first demonstrating that this is something people want to use: make a third-party package, develop plugins to use it in IDEs, and show that there is an appetite for it.

If this is somewhat successful, then we can think about adding core support. This is not a feature that fundamentally requires core support (as you noticed); it is similar to asyncio and typing/mypy, both of which started out as third-party experiments before gaining core support.


Only when a class defines __invariant__, and only around public method calls; so essentially a conditional check just before and after those calls.

Think of it like __init_subclass__ or __getattr__: the runtime doesn’t pay the cost unless the hook is there. If __invariant__ isn’t defined, there’s no behavioral change, no performance hit.

The idea is to keep it lightweight, opt-in, and only active where explicitly used.

That’s totally fair, and that’s the path I’m already on with Invariant Python (the library I linked in my original post).

It’s early, but the goal is exactly what you described: see if there’s real appetite for this, get feedback on pain points, and learn whether the current ecosystem can support it cleanly, or whether deeper integration would eventually be justified.

Appreciate the comparison to asyncio and typing; that’s a really helpful frame to keep in mind.

That said, I do think DbC is in a different category: it’s not as well-known or widely understood as things like typing or async, which makes adoption more of an awareness challenge than a technical one.

The lack of existing appetite might not be a sign of disinterest - just unfamiliarity. That’s part of why I’m pushing it into the open: not just to test adoption, but to help surface the conversation around consistency and correctness, which I think matters just as much in the long run.