I know that Unix and POSIX time measure time as the number of seconds that have passed since Thursday, 1 January 1970, and that Windows NT systems measure time as the number of 100-nanosecond intervals that have passed since 1 January 1601. What I'm used to is every 1,000 milliseconds making up 1 second.
Python uses microseconds, where 1,000,000 make up 1 second. I'm curious to know what Python's time is based on.
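
For what it's worth, here's a rough sketch of what I've been poking at in the interpreter (just the standard `time` and `datetime` modules, nothing exotic), in case it helps show what I mean:

```python
import time
import datetime

# time.time() returns seconds since the epoch as a float,
# with sub-second precision in the fractional part.
print(time.time())

# time.gmtime(0) shows what the epoch actually is on this platform
# (1970-01-01 00:00:00 on the systems I've tried).
print(time.gmtime(0))

# datetime objects store microseconds: the smallest representable
# difference is 1 microsecond, i.e. 1/1,000,000 of a second.
print(datetime.datetime.resolution)  # 0:00:00.000001
```

So is Python's time ultimately counted in seconds, microseconds, or something else underneath?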