What is the behavior of .peek() with buffer_size=None?
I have been trying to implement this as well as evolve buffered I/O in general a bit (see: Reworking "Buffered I/O" in CPython; I refer to this as buffer_size=0 / buffering=0). The code is internally structured to require a buffer at the moment, but that is definitely changeable.
Seems to me you need a minimum buffer size of 1 for peek() to work, at least until the peeked byte is read. You could temporarily change the buffer size behind the scenes, but that could break code that has requested no buffering for a reason. I’d vote for simply disallowing peek() when buffer_size=0.
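To illustrate the point with today's behavior: peek() returns bytes from the buffer without advancing the stream position, so it needs at least one byte of buffer to hold them, and the current BufferedReader already rejects buffer_size=0 outright. A small sketch (BytesIO stands in for a real file here):

```python
import io

raw = io.BytesIO(b"hello")
buf = io.BufferedReader(raw, buffer_size=1)

# peek() returns buffered bytes without advancing the position.
peeked = buf.peek(1)
assert peeked[:1] == b"h"
assert buf.read(1) == b"h"  # still at the start: peek() did not consume

# buffer_size=0 is rejected today, so a truly unbuffered reader
# could only honour peek() via a hidden one-byte buffer.
try:
    io.BufferedReader(io.BytesIO(b"x"), buffer_size=0)
except ValueError:
    pass  # "buffer size must be strictly positive"
```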
Note that what the OP asked for is not for buffer_size=None to be the same as buffer_size=0. They ask for it to be equivalent to not providing buffer_size at all. That is almost always a good idea, since it simplifies writing wrappers, but the fact that buffer_size=0 has a different meaning makes it IMO not an obvious choice.
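The wrapper argument is worth spelling out. Without a "None means default" convention, a wrapper that wants to forward buffer_size unchanged has to invent its own sentinel for "not provided"; open_wrapped and _DEFAULT below are hypothetical names for that pattern, not stdlib API:

```python
import io

_DEFAULT = object()  # sentinel for "caller did not pass buffer_size"

def open_wrapped(path, buffer_size=_DEFAULT):
    """Hypothetical wrapper that forwards buffer_size to BufferedReader."""
    raw = io.FileIO(path, "rb")
    if buffer_size is _DEFAULT:
        # Must special-case "not provided" to get the stdlib default
        # (io.DEFAULT_BUFFER_SIZE) rather than pass something through.
        return io.BufferedReader(raw)
    return io.BufferedReader(raw, buffer_size=buffer_size)
```

If buffer_size=None simply meant "use the default", the sentinel dance would disappear: the wrapper could declare buffer_size=None and pass it straight through.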
Re-reading, I agree that seems to be the initial ask. I can see making buffer_size=None mean “use the default”, but to me that reads ambiguously against “None means no buffer”.