Defined by the language; implementations have no choices here. However they implement it, they must create the illusion that ints are represented in 2’s-complement with an “infinite number” of sign bits.
>>> ~True == -2
... DeprecationWarning elided ...
True
>>> ~False == -1
... DeprecationWarning elided ...
True
Those must be true under all Python implementations, as must identities like -2 & 7 == 6. Every bit is defined. The actual representation isn’t exposed, though. For example, CPython happens to store ints as an absolute value plus a distinct “sign” bit, but that’s invisible at the language level.
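A minimal sketch of that illusion (the `to_bits` helper is mine for illustration, not anything CPython exposes): any finite two’s-complement width that’s wide enough gives the same low bits, and widening just adds more copies of the sign bit.

```python
def to_bits(n: int, width: int) -> str:
    # Two's-complement bit string of n at the given width. For negative n
    # the leading bits are all 1s ("sign bits"); for non-negative, all 0s.
    return format(n & ((1 << width) - 1), f"0{width}b")

print(to_bits(-2, 8))    # '11111110'
print(to_bits(-2, 16))   # '1111111111111110' -- widening only adds sign bits
print(to_bits(7, 8))     # '00000111'

# So -2 & 7 keeps only the positions where both have a 1 bit: 0b110 == 6.
assert -2 & 7 == 6

# And ~n flips every bit, which in two's-complement is -(n + 1) for any int.
assert ~1 == -2
assert ~0 == -1
assert ~-1 == 0
```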
People mistakenly using “~” for logical “not” have deeper conceptual gaps than a deprecation warning can fill!