I don't quite understand how iterators have memory in Python.
>>> l1 = [1, 2, 3, 4, 5, 6]
>>> l2 = [2, 3, 4, 5, 6, 7]
>>> iz = izip(l1, l2)
Your examples are too simplistic. Consider this:
nums = [1, 2, 3, 4, 5, 6]
nums_it = (n for n in nums)
nums_it is a generator that yields the items of nums unmodified, so here there is no advantage. But consider this:
squares_it = (n ** 2 for n in nums)
and compare it with:
squares_lst = [n ** 2 for n in nums]
With squares_it, we are generating the squares of nums on the fly only when requested. With squares_lst, we are generating all of them at once and storing them in a new list.
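A rough way to see the difference in memory is sys.getsizeof (my own illustration, in Python 3, with a larger input so the gap is visible; note that getsizeof reports only the container's own size, not its elements):

```python
import sys

# The list comprehension materializes every element up front...
squares_lst = [n ** 2 for n in range(1_000_000)]
# ...while the generator expression only stores its running state.
squares_it = (n ** 2 for n in range(1_000_000))

print(sys.getsizeof(squares_lst))  # several megabytes
print(sys.getsizeof(squares_it))   # on the order of a hundred bytes
```

The generator's size stays constant no matter how many items it will produce; the list grows with its length.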
So, when you do:
for n in squares_it:
    print(n)
it's like if you were doing:
for n in nums:
    print(n ** 2)
But when you do:
for n in squares_lst:
    print(n)
it's like if you were doing:
squares_lst = []
for n in nums:
    squares_lst.append(n ** 2)
for n in squares_lst:
    print(n)
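One caveat that goes with this trade-off (a general property of generators, not specific to your example): squares_lst can be iterated as many times as you like, but squares_it can be consumed only once:

```python
nums = [1, 2, 3]
squares_it = (n ** 2 for n in nums)

print(list(squares_it))  # [1, 4, 9]
print(list(squares_it))  # [] -- the generator is exhausted after the first pass
```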
If you don't need (or don't have) the list nums, then you can save even more space by using:
squares_it = (n ** 2 for n in xrange(1, 7))
(That's Python 2's xrange; in Python 3 use range, which is already lazy.)
Generators and iterators also provide another significant advantage (which may actually be a disadvantage, depending on the situation): they are evaluated lazily.
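To make the laziness visible, here is a small sketch (noisy_square is a helper I made up just so we can observe when each value is computed):

```python
def noisy_square(n):
    # Side effect so we can see *when* each value is computed.
    print("computing", n)
    return n ** 2

lazy = (noisy_square(n) for n in [1, 2, 3])
# Nothing has been printed yet: no element has been computed.

print(next(lazy))  # prints "computing 1", then 1
print(next(lazy))  # prints "computing 2", then 4
```

Each value is computed only at the moment something asks for it, which is exactly why an error hiding in the third element won't surface until you reach it.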
Also, generators and iterators may yield an infinite number of elements. An example is itertools.count(), which yields 0, 1, 2, 3, ... without ever ending.
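You obviously can't turn such an iterator into a list; instead you consume it piecewise, for example with itertools.islice:

```python
from itertools import count, islice

# count() yields 0, 1, 2, 3, ... forever; the generator expression
# filters it lazily, and islice takes just the first few results.
evens = (n for n in count() if n % 2 == 0)
print(list(islice(evens, 5)))  # [0, 2, 4, 6, 8]
```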