FWIW here is a real-world example from a large library that wraps NumPy for GPU support. I used None here because that is what the wrapped API uses, and we wanted to match it precisely. But usually, when I wanted to be more careful in similar situations, I would define an Unset class to use (sketched after the example below).
def setflags(
    self,
    write: Union[bool, None] = None,
    align: Union[bool, None] = None,
    uic: Union[bool, None] = None,
) -> None:
    ...
    # Be a bit more careful here, and only pass params that are explicitly
    # set by the caller. The numpy interface specifies only bool values,
    # despite its None defaults.
    kws = {}
    if write is not None:
        kws["write"] = write
    if align is not None:
        kws["align"] = align
    if uic is not None:
        kws["uic"] = uic
    self.__array__().setflags(**kws)
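For concreteness, here is roughly what I mean by an Unset class. This is just a minimal sketch, and the _UnsetType/UNSET names are purely illustrative:

class _UnsetType:
    """Sentinel meaning 'the caller did not pass this argument'."""

    def __repr__(self) -> str:
        return "UNSET"

UNSET = _UnsetType()

def setflags(self, write=UNSET, align=UNSET, uic=UNSET) -> None:
    # UNSET is distinct from None, so an explicit None from the caller
    # can still be forwarded if the wrapped API treats it as meaningful.
    kws = {}
    if write is not UNSET:
        kws["write"] = write
    if align is not UNSET:
        kws["align"] = align
    if uic is not UNSET:
        kws["uic"] = uic
    self.__array__().setflags(**kws)

The "is not UNSET" checks read the same as the None checks above, but they free up None to be a real value.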
I don't really see what a built-in Default (or Unset, or whatever) would buy here. If the goal is to avoid passing unset parameters so that the called function uses its own defaults, then you still have to do the work of actually excluding the unset values, since in general you can't assume the function you call knows anything about Default or how to handle it.
What might be useful is a way to get all the passed params as a dict, so that it could just be filtered easily. Then you could write the above as:
def setflags(
    self,
    write: Union[bool, None] = None,
    align: Union[bool, None] = None,
    uic: Union[bool, None] = None,
) -> None:
    kws = {k: v for k, v in __kwargs__.items() if v is not None}  # or Unset, etc.
    self.__array__().setflags(**kws)
where __kwargs__ means roughly: "all of this function's params in one dict, regardless of how they were supplied".
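Something close is already possible today with locals(), as long as you call it before binding any other local names; a rough sketch:

def setflags(
    self,
    write: Union[bool, None] = None,
    align: Union[bool, None] = None,
    uic: Union[bool, None] = None,
) -> None:
    # At the very top of the body, locals() contains exactly the
    # parameters, so it approximates the hypothetical __kwargs__.
    params = dict(locals())
    params.pop("self")
    kws = {k: v for k, v in params.items() if v is not None}
    self.__array__().setflags(**kws)

It's fragile, though: it silently breaks if anyone adds a line above the locals() call, which is part of why a real __kwargs__ would be nicer.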