I have some use cases in which I need to run generator functions without caring about the yielded items.
I cannot make them non-generator functions because in other use cases I do need the values they yield.
You could just have two functions that each do one thing and call the appropriate one at the appropriate time?
def create_table(model, fail_silently=True):
    """Create the table."""
    try:
        model.create_table(fail_silently=fail_silently)
    except Exception:
        return (False, model)
    else:
        return (True, model)

def create_tables(MODELS):
    for model in MODELS:
        create_table(model)

def iter_create_tables(MODELS):
    for model in MODELS:
        yield create_table(model)
When you care about the returned values do:
from sys import stderr

for success, table in iter_create_tables(MODELS):
    if success:
        print('Creation of table {} succeeded.'.format(table))
    else:
        print('Creation of table {} failed.'.format(table), file=stderr)
When you don't, just do:
create_tables(MODELS)
One very simple and possibly efficient solution could be:
def exhaust(generator): all(generator)
This works if we can assume that the generator only ever yields truthy values (as in your case, where a 2-tuple (success, table) is truthy even if success and table are both falsy). Use any(generator) if it only ever yields falsy values, and in the "worst case", all(x or True for x in generator).
Being that short & simple, you might not even need a function for it!
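For concreteness, a quick sketch of the three variants, reusing the iter_create_tables generator from the answer above (assumed names; adapt to your own generators):
# Items are always truthy (non-empty tuples), so all() consumes everything:
all(iter_create_tables(MODELS))

# Items are always falsy (print returns None), so any() consumes everything:
any(print(model) for model in MODELS)

# Items may be either; make every item count as truthy so nothing short-circuits:
all(x or True for x in iter_create_tables(MODELS))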
Regarding the "why?" comment (I dislike these...): There are many cases where one may want to exhaust a generator. To cite just one, it's a way of doing a for loop as an expression, e.g., any(print(i,x) for i,x in enumerate(S)) - of course there are less trivial examples.
Setting up a for loop for this could be relatively expensive, keeping in mind that a for loop in Python is fundamentally successive execution of simple assignment statements; you'll be executing n (number of items in generator) assignments, only to discard the assignment targets afterwards.
You can instead feed the generator to a zero-length deque; it consumes at C speed and does not use up memory, unlike list() and other callables that materialise iterators/generators:
from collections import deque

def exhaust(generator):
    deque(generator, maxlen=0)
Taken from the consume itertools recipe.
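For reference, that recipe (roughly as it appears in the itertools documentation) also supports advancing the iterator by only n steps:
from collections import deque
from itertools import islice

def consume(iterator, n=None):
    """Advance the iterator n steps ahead. If n is None, consume entirely."""
    # Use functions that consume iterators at C speed.
    if n is None:
        # Feed the entire iterator into a zero-length deque.
        deque(iterator, maxlen=0)
    else:
        # Advance to the empty slice starting at position n.
        next(islice(iterator, n, n), None)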
Based on your use case it's hard to imagine that there would be sufficiently many tables to create that you would need to consider performance.
Additionally, table creation is going to be much more expensive than iteration.
So the for loop that you already have would seem the simplest and most Pythonic solution in this case.
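For completeness, "the for loop that you already have" presumably looks something like this (names borrowed from the first answer; adjust to your actual generator):
# Run the generator purely for its side effects, discarding the yielded tuples.
for _ in iter_create_tables(MODELS):
    pass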