I define an object at line 9. Why are the objects appended to the list the same if I don't redefine it at line 14?
class Animal:
    def __init__(self):
        self.name = ''
        self.age = 0

animals = []
animal = Animal() # line 9
animal.name = 'snake'
animal.age = 3
animals.append(animal)
animal = Animal() # line 14, redefine?
animal.name = 'monkey'
animal.age = 5
animals.append(animal)
Because appending onto a list doesn’t make a copy of the object. It just references the object. If you modify the original object, the view of it from the list is modified as well.
At line 14, the Animal() call creates a new, independent object.
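To make this concrete, here is a minimal sketch (reusing the Animal class from the question) that skips the second Animal() call and then checks identity:

```python
class Animal:
    def __init__(self):
        self.name = ''
        self.age = 0

animals = []
animal = Animal()
animal.name = 'snake'
animals.append(animal)

# No new Animal() here, so we keep mutating the object already in the list.
animal.name = 'monkey'
animals.append(animal)

print(animals[0] is animals[1])  # True: both entries reference one object
print(animals[0].name)           # 'monkey': the first entry "changed" too
```

With the second Animal() call restored, animals[0] is animals[1] is False and the two entries keep their own names.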
But basic types like numbers and strings do make copies, right?
Is class type a reference type in Python? (Isn't it just a scripting language anyway?)
animals = []
animal = 'snake'
animals.append(animal)
# animal = str() # not needed
animal = 'monkey'
animals.append(animal)
for animal in animals:
    print(animal)
No, the beauty of Python is that there are no "basic types". Everything is an object, and every reference works the same way.
Of course, some types (like your “number, string”) are immutable, so you usually can’t detect whether you get a copy or the same object, and Python does cheat sometimes in the name of efficiency (for example, small numbers are usually cached).
I’m not sure what you mean by class type being a reference type (and what it has to do with being a scripting language). In Python every type is a class, and every class is a type – there is no material difference between the two. Also, every class is an object, and as such it can be referenced – that is, it can be given a name in any particular namespace.
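A quick way to poke at this (the small-int check relies on a CPython implementation detail, not on anything the language guarantees):

```python
# Assignment never copies: both names end up referencing the same object.
s = 'snake'
t = s
print(s is t)  # True: no copy was made

# CPython caches small integers, so equal small ints are the same object.
a = 5
b = 5
print(a is b)  # True in CPython (implementation detail, not a guarantee)
```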
Every value, everything, in Python is a reference type. Lists, strings, ints, floats, None, True, False, everything.
The important factor is which objects can be modified, and which can only be replaced by a new object.
x = []  # a mutable object that can be modified
y = x  # 'x' and 'y' are two names for the same list object
x.append(1)  # modifies the existing list object
print(y)  # prints [1]

x = 22  # an immutable, unmodifiable object
y = x  # 'x' and 'y' are two names for the same int object
x += 1  # does not modify the int but binds a new int, 23, to 'x'
print(y)  # prints 22, not 23
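The same distinction can be made visible with id(), which reports an object's identity; a small sketch extending the example above:

```python
x = []
y = x
x.append(1)
print(x is y)  # True: mutating a list never changes which object a name references

x = 22
old_id = id(x)          # identity of the int 22 while 'x' still references it
x += 1                  # rebinds 'x' to a different int object, 23
print(id(x) == old_id)  # False: 'x' now names a new object
```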
It’s a bit complicated. I think Apple’s Swift is easier to understand than Python.
If you have preconceived notions from languages like Java, then probably Swift is easier to understand. But if you just look at it by itself, Python’s model couldn’t be easier – it’s exactly how the real world works. Our naming doesn’t affect objects, nothing is copied implicitly, and objects are modified by changing their attributes, or replaced by new ones, letting the garbage collector take care of the old ones.
If you understand reference types in Swift, you should understand reference types in Python.
In Python, you never have to worry whether a value is a reference type or a value type, since the answer is always the same. In Swift, you always have to ask.
Java values are like Swift values: there are two kinds of values.
Java “machine values” or “unboxed values” are like Swift value types.
Java “objects” and “boxed values” are like Swift reference types. Both are the same as everything in Python.
I don’t think Swift is easier.
Not only do you have to worry about the difference between reference and value types, but you also have to worry about whether to pass values with & or not.
Python has one single data model: everything is an object. Swift and Java have two: some values are objects, some values are not objects.
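That single data model is easy to verify: everything, including functions, classes, and None, is an instance of object. A minimal check:

```python
# Every Python value is an object, so isinstance(v, object) is always True.
values = [1, 3.14, 'snake', None, True, [], {}, len, int]
print(all(isinstance(v, object) for v in values))  # True
```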
I’m not quite sure why you’re replying to me… it seems to me you’re saying the same thing I’m saying. If you already know Java, then Swift is easier. But in an absolute sense, of course Python’s model has fewer new concepts to understand.