Why is math.gcd(-1,0) == math.gcd(0,1) == 1?

I started writing gcd() for whatever reason and then compared with math.gcd(), and it's surprising that math.gcd(-1, 0) says that 1 is a common divisor of -1 and 0.

import math
import itertools

def gcd(a: int, b: int) -> int:
    if a == 0 or b == 0:
        return 0
    if a == 1 or b == 1:
        return 1
    _a, _b = (a,b) if a > b else (b,a)
    for n in (_b, math.floor(_b/2) if not _b % 2 else _b, *range(math.floor(_b/2), 1, -1)):
        if not _a % n and not _b % n:
            return abs(n)  # this appears to match math.gcd()
            # return n     # returning n unchanged can give a negative result for negative inputs
    return 1

assert gcd(10, 20) == 10 == math.gcd(10,20)
assert gcd(20, 10) == 10 == math.gcd(20,10)

assert gcd(60, 20) == 2*2*5
assert gcd(20, 60) == 2*2*5

assert gcd(3, 11) == 1 == math.gcd(3, 11)
assert gcd(3, 3) == 3
assert gcd(3, 1) == 1 == math.gcd(3,1)
assert gcd(11, 3) == 1

assert gcd(512, 32) == 32 == math.gcd(512, 32)
assert gcd(512, 64) == 64 

assert gcd(2,4) == 2
assert gcd(4,2) == 2

assert gcd(0,0) == 0 == math.gcd(0,0)

assert gcd(-1, 0) == 0
assert gcd(0, -1) == 0

assert gcd(-10,20) == 10 == math.gcd(-10, 20)

assert gcd(37, 71) == 1 == math.gcd(37,71), gcd(37, 71)

for ab in (*itertools.combinations(range(1, 1000), r=2), (0,-1), (1,0) ):
    #display(dict(ab=ab))
    assert gcd(*ab) == math.gcd(*ab), (dict(ab=ab, gcd=gcd(*ab), mathgcd=math.gcd(*ab)))
The loop fails on the (0, -1) case:

AssertionError: {'ab': (0, -1), 'gcd': 0, 'mathgcd': 1}

So: (1) Is this the case in all versions of Python? (2) Should it be; does the mathematical definition of gcd ever say that 0 is a common divisor? (3) And can or should math.gcd() be changed?

import math
assert math.gcd(0,-1) == 1
assert math.gcd(1, 0) == 1
assert math.gcd(0, 0) == 0
assert math.gcd(0, 1) == 1
assert math.gcd(-1, 0) == 1
# also fwiw:
assert math.gcd(-1, 1) == 1
assert math.gcd(1, -1) == 1

This follows from the definition of the gcd as given on Wikipedia:

When one of a and b is zero, the GCD is the absolute value of the nonzero integer: gcd(a, 0) = gcd(0, a) = |a|. This case is important as the terminating step of the Euclidean algorithm.
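
To make that concrete, here is a quick brute-force sketch (the helper common_divisors is purely illustrative, not how math.gcd works internally): every nonzero integer divides 0, so the common divisors of -1 and 0 are {-1, 1}, and the greatest of them is 1.

import math

def common_divisors(a, b, limit=50):
    """Brute-force the common divisors of a and b among nonzero d with |d| <= limit."""
    return sorted(d for d in range(-limit, limit + 1)
                  if d != 0 and a % d == 0 and b % d == 0)

# Every nonzero d divides 0 (0 % d == 0), so the common divisors of -1 and 0
# are exactly {-1, 1}, and the greatest of them is 1 -- matching math.gcd().
assert common_divisors(-1, 0) == [-1, 1]
assert max(common_divisors(-1, 0)) == 1 == math.gcd(-1, 0)

# And gcd(a, 0) == |a| in general:
assert all(math.gcd(a, 0) == abs(a) for a in range(-20, 21))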

So, to answer your questions: (1) Yes, (2) Yes, (3) No and No.

def gcd(a, b):
    """Calculate the Greatest Common Divisor (GCD) of two numbers using the Euclidean algorithm.

    :param int a: First number
    :param int b: Second number
    :return: The GCD of a and b
    :rtype: int
    """
    while b != 0:
        a, b = b, a % b
    # abs() so that negative inputs (e.g. gcd(-1, 0)) agree with math.gcd()
    return abs(a)
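
As a quick sanity check (assuming the abs() on the return above, and only re-running the corner cases from this thread), this matches math.gcd():

import math

for a, b in [(10, 20), (20, 10), (0, 0), (-1, 0), (0, -1), (-10, 20), (37, 71)]:
    assert gcd(a, b) == math.gcd(a, b), (a, b, gcd(a, b), math.gcd(a, b))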