How can I make custom exceptions show the public package path in Python tracebacks?

I have a Python library where everything is exposed through the top-level import.

My exceptions are defined in src/foobar/_errors.py and then re-exported in src/foobar/__init__.py, so they can be used like this in user code:

from foobar import MySpecificFooBarError

For example, in _errors.py I have something like:

class MyBaseFooBarError(Exception):
    pass

class MySpecificFooBarError(MyBaseFooBarError):
    pass

This part works fine, and it’s documented.

The problem is with tracebacks. When one of these exceptions is raised, the traceback shows the fully qualified name from the private module, e.g.:

Traceback (most recent call last):
  [...]
foobar._errors.MySpecificFooBarError: Oops!

This is technically correct, but it’s not what I want. Users find it confusing because they’ve been told to use foobar.MySpecificFooBarError, not foobar._errors.MySpecificFooBarError (I got an issue about this, which is why I’m making this post in the first place).
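The private name shows up because CPython's traceback machinery builds the displayed type name from the exception class's __module__ and __qualname__ attributes. A minimal sketch simulating the situation (the class here stands in for one defined in foobar/_errors.py):

```python
import traceback

class MySpecificFooBarError(Exception):
    pass

# Simulate the class having been defined in the private module:
MySpecificFooBarError.__module__ = "foobar._errors"

try:
    raise MySpecificFooBarError("Oops!")
except MySpecificFooBarError as exc:
    # The last line of the formatted traceback is "<module>.<qualname>: <message>"
    last = traceback.format_exception_only(type(exc), exc)[-1]
    print(last.strip())  # foobar._errors.MySpecificFooBarError: Oops!
```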

My current solution

Right now, I manually override the metadata on each exception:

class MySpecificFooBarError(MyBaseFooBarError):
    __module__ = "foobar"
    __qualname__ = "MySpecificFooBarError"
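With that override in place, the formatted traceback line shows the public path. A quick self-contained check (sketch):

```python
import traceback

class MyBaseFooBarError(Exception):
    pass

class MySpecificFooBarError(MyBaseFooBarError):
    __module__ = "foobar"  # report the public import path
    __qualname__ = "MySpecificFooBarError"

line = traceback.format_exception_only(
    MySpecificFooBarError, MySpecificFooBarError("Oops!")
)[-1]
print(line.strip())  # foobar.MySpecificFooBarError: Oops!
```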

This works, but it’s not ideal. What I’d really like is to define this behavior once in MyBaseFooBarError and have subclasses automatically inherit it. At the moment, I can’t seem to get that working, so I need to repeat the __module__ and __qualname__ assignments for every error class.

Question

Is there a way to make Python show the public import path (foobar.MySpecificFooBarError) in tracebacks without having to redefine __module__/__qualname__ in every subclass, and without moving all exception definitions into __init__.py?


I don’t think you should do this. Classes are often defined in places different from where you should import them; users need to get used to this.

By changing the class repr this way, you make it harder to figure out where to look for the class. Especially with pure-Python libraries, I commonly want to navigate to a class's definition to read its docstrings, additional comments, or method definitions.

However, if you really want to do this, using __init_subclass__ should work.


I have the same problem in the lmstudio-python SDK (we promote the "import lmstudio as lms" style in all our examples, and exposing the full internal module names via runtime introspection undermines that recommendation).

My solution is an @sdk_public_type class decorator.
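The decorator's body isn't shown in the post, but the idea can be sketched as follows; the @sdk_public_type name and the lmstudio package name come from the post, while everything else is an assumption rather than the actual lmstudio-python implementation:

```python
def sdk_public_type(cls):
    # Rewrite __module__ so introspection and traceback formatting report
    # the public package rather than the internal defining module.
    cls.__module__ = "lmstudio"  # assumed public package name
    return cls

@sdk_public_type
class SomePublicError(Exception):  # hypothetical example class
    pass

print(SomePublicError.__module__)  # lmstudio
```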

I leave __qualname__ alone as that reports lexical nesting within the file rather than anything related to the module structure.
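To illustrate the distinction: __qualname__ records lexical nesting within the defining file, while __module__ records the module the class was defined in:

```python
class Outer:
    class Inner:
        pass

# __qualname__ reflects nesting, independent of the module path:
print(Outer.Inner.__qualname__)  # Outer.Inner
# __module__ is a separate attribute naming the defining module:
print(Outer.Inner.__module__)    # e.g. "__main__" when run as a script
```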

I personally mostly care about this for type hinting import purposes (if folks turn an inferred type hint into an explicit annotation, I want their IDE to suggest the public import name, not the internal implementation location), but I do also use it to make exception tracebacks suggest the desired import to catch reported exceptions.


I get your point, and I do realize that. My main motivation here is improving the readability of tracebacks for users who aren’t Python devs. IDEs will still navigate directly to the class definition, so the dev experience doesn’t really change there.

For someone relying on grep, it does add an extra step (e.g. grep MyError foobar/__init__.py → see the re-export → follow it to the actual file). That’s not ideal, but I think it’s fine since anyone going that route is already a developer and can figure it out.

Thanks for the example and especially the __qualname__ tidbit, I didn’t realize that distinction.

In my case I went with the __init_subclass__ approach (thanks again @MegaIng), so I can centralize the behavior at the base error class level:

class MyBaseFooBarError(Exception):
    __module__ = "foobar"  # covers the base class itself; the hook below only fires for subclasses

    def __init_subclass__(cls, **kwargs) -> None:
        super().__init_subclass__(**kwargs)  # keep cooperative subclassing working
        cls.__module__ = "foobar"

class MySpecificFooBarError(MyBaseFooBarError):
    pass
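For completeness, a self-contained check that the hook propagates to subclasses and that the traceback line comes out as intended (sketch):

```python
import traceback

class MyBaseFooBarError(Exception):
    __module__ = "foobar"  # the hook below only runs for subclasses

    def __init_subclass__(cls, **kwargs) -> None:
        super().__init_subclass__(**kwargs)
        cls.__module__ = "foobar"

class MySpecificFooBarError(MyBaseFooBarError):
    pass

print(MySpecificFooBarError.__module__)  # foobar
line = traceback.format_exception_only(
    MySpecificFooBarError, MySpecificFooBarError("Oops!")
)[-1]
print(line.strip())  # foobar.MySpecificFooBarError: Oops!
```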