I have the following code:
from sklearn.model_selection import train_test_split
from scipy.misc import imresize

def _chunks(l, n):
    """Yield successive n-sized chunks from l."""
    for i in range(0, len(l), n):
        yield l[i:i + n]
I found the source of the problem. First, my dataset is fully read before the fit finishes, so the generator raises:
Exception in thread Thread-50:
Traceback (most recent call last):
File "C:\Anaconda3\Lib\threading.py", line 916, in _bootstrap_inner
self.run()
File "C:\Anaconda3\Lib\threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "C:\Users\user\venv\machinelearning\lib\site-packages\keras\utils\data_utils.py", line 560, in data_generator_task
generator_output = next(self._generator)
StopIteration
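For illustration only (not my exact code): fit_generator expects the generator to yield batches indefinitely, so a finite generator built on _chunks raises StopIteration once the data is exhausted. A minimal sketch of an endless wrapper, with X_train, y_train and batch_size as assumed placeholder names:

def batch_generator(X_train, y_train, batch_size):
    # Loop forever so fit_generator never sees StopIteration;
    # each pass re-iterates the dataset in batch_size-sized chunks.
    while True:
        for x_batch, y_batch in zip(_chunks(X_train, batch_size),
                                    _chunks(y_train, batch_size)):
            yield x_batch, y_batch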
The exception handler in data_generator_task sets stop_event and re-raises the exception. But:
def get(self):
    """Creates a generator to extract data from the queue.
    Skip the data if it is `None`.
    # Returns
        A generator
    """
    while self.is_running():
        if not self.queue.empty():
            inputs = self.queue.get()
            if inputs is not None:
                yield inputs
        else:
            time.sleep(self.wait_time)
So even when the stop event is set, it can still pull data from the queue.
So I limited max_queue_size to 1.
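Roughly, the fit call then looks like this (a sketch, assuming a Keras 2 fit_generator signature where the argument is named max_queue_size; model, epochs and the generator names are placeholders):

model.fit_generator(
    batch_generator(X_train, y_train, batch_size),
    steps_per_epoch=len(X_train) // batch_size,
    epochs=epochs,
    max_queue_size=1)  # buffer at most one batch ahead of training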