Question
I have some use cases in which I need to run generator functions without caring about the yielded items.
I cannot make them non-generator functions, because in other use cases I certainly need the yielded values.
I am currently using a trivial self-made function to exhaust the generators.
def exhaust(generator):
    for _ in generator:
        pass
Is there a simpler way to do this that I'm missing?
Edit: Here is a use case:
def create_tables(fail_silently=True):
    """Create the respective tables."""
    for model in MODELS:
        try:
            model.create_table(fail_silently=fail_silently)
        except Exception:
            yield (False, model)
        else:
            yield (True, model)
In some contexts, I care about the success and error values…
from sys import stderr

for success, table in create_tables():
    if success:
        print('Creation of table {} succeeded.'.format(table))
    else:
        print('Creation of table {} failed.'.format(table), file=stderr)
… and in some contexts I just want to run the function "blindly":
exhaust(create_tables())
Answer 1:
Setting up a for loop for this could be relatively expensive. Keep in mind that a for loop in Python is fundamentally a succession of simple assignment statements: you'll be executing n assignments (where n is the number of items in the generator), only to discard the assignment targets afterwards.
You can instead feed the generator to a zero-length deque; it consumes the iterator at C speed and, unlike list() and other callables that materialise iterators/generators, does not use up memory:
from collections import deque

def exhaust(generator):
    deque(generator, maxlen=0)
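If you want to verify the speed difference yourself, a quick benchmark along these lines should do (the iterable size and repeat count are arbitrary choices for illustration):

from collections import deque
from timeit import timeit

def exhaust_loop(iterable):
    # Plain for loop: one Python-level assignment per item.
    for _ in iterable:
        pass

def exhaust_deque(iterable):
    # Zero-length deque: items are consumed in C and nothing is stored.
    deque(iterable, maxlen=0)

# Create a fresh iterator per call so each run actually consumes items.
print(timeit(lambda: exhaust_loop(iter(range(100000))), number=100))
print(timeit(lambda: exhaust_deque(iter(range(100000))), number=100))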
Taken from the consume recipe in the itertools documentation.
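For completeness, the full consume recipe also supports advancing an iterator by only n steps; it looks roughly like this (paraphrased from the CPython itertools docs):

import collections
from itertools import islice

def consume(iterator, n=None):
    "Advance the iterator n steps ahead. If n is None, consume entirely."
    if n is None:
        # Feed the entire iterator into a zero-length deque.
        collections.deque(iterator, maxlen=0)
    else:
        # Advance to the empty slice starting at position n.
        next(islice(iterator, n, n), None)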
Answer 2:
Based on your use case it's hard to imagine that there would be sufficiently many tables to create that you would need to consider performance.
Additionally, table creation is going to be much more expensive than iteration.
So the for loop that you already have would seem to be the simplest and most Pythonic solution in this case.
Source: https://stackoverflow.com/questions/47456631/simpler-way-to-run-a-generator-function-without-caring-about-items