Question
Consider these two functions:
def foo():
    x = 0
    while True:
        yield x
        x += 1

def wrap_foo(limit=10, gen=True):
    fg = foo()
    count = 0
    if gen:
        while count < limit:
            yield next(fg)
            count += 1
    else:
        return [next(fg) for _ in range(limit)]
foo() is a generator, and wrap_foo() just puts a limit on how much data gets generated. I was experimenting with having the wrapper behave as a generator with gen=True, or as a regular function that puts all generated data into memory directly with the kwarg gen=False.
The regular generator behavior works as I'd expect:
In [1352]: [_ for _ in wrap_foo(gen=True)]
Out[1352]: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
However, with gen=False, nothing gets generated.
In [1351]: [num for num in wrap_foo(gen=False)]
Out[1351]: []
It seems like Python pre-classifies the function as a generator based on the presence of the yield statement (the latter example works perfectly if yield is commented out).
Why is this? I would like to understand the mechanisms at play here. I'm running Python 3.6.
Answer 1:
It seems like Python pre-classifies the function as a generator based on the presence of the yield statement
Yes, that's exactly what happens. wrap_foo is determined to be a generator at function definition time. You could consider using generator expressions instead:
def wrap_foo(limit=10, gen=True):
    fg = foo()
    if gen:
        return (next(fg) for _ in range(limit))
    else:
        return [next(fg) for _ in range(limit)]
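With this version, both branches behave as expected: the function body contains no yield, so it is an ordinary function that returns either a lazy generator expression or an eager list. A quick sketch, reusing the foo() generator from the question:

```python
def foo():
    x = 0
    while True:
        yield x
        x += 1

def wrap_foo(limit=10, gen=True):
    fg = foo()
    if gen:
        # Lazy: values are pulled from fg only as the caller iterates.
        return (next(fg) for _ in range(limit))
    else:
        # Eager: all `limit` values are pulled immediately into a list.
        return [next(fg) for _ in range(limit)]

print(list(wrap_foo(gen=True)))   # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
print(wrap_foo(gen=False))        # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```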
Answer 2:
It seems like Python pre-classifies the function as a generator based on the presence of the yield statement (latter example works perfectly if yield is commented out).
Why is this?
Because Python can't wait until the function actually executes a yield to decide whether it's a generator. First, generators are defined to not execute any of their code until the first next. Second, a generator might never actually reach any of its yield statements, if it happens to not generate any elements.
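This classification can be observed directly with the standard library's inspect module: a def whose body contains yield anywhere (even unreachably) is marked as a generator function at definition time, and a return value inside it is attached to StopIteration instead of being handed back to the caller. A small sketch:

```python
import inspect

def plain():
    return [1, 2, 3]

def gen_like():
    return [1, 2, 3]  # not a normal return: becomes StopIteration.value
    yield             # unreachable, but its mere presence matters

print(inspect.isgeneratorfunction(plain))     # False
print(inspect.isgeneratorfunction(gen_like))  # True

# Iterating gen_like() yields nothing; the list rides on StopIteration:
g = gen_like()
try:
    next(g)
except StopIteration as e:
    print(e.value)  # [1, 2, 3]
```

This is exactly why the gen=False branch in the question produced []: the list comprehension ran, but its result was swallowed by the StopIteration that the for-loop protocol silently consumes.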
Source: https://stackoverflow.com/questions/42540630/why-cant-you-toggle-a-function-generators-behavior-by-an-argument