Keras predict loop memory leak using tf.data.Dataset but not with a numpy array
Question

I encounter a memory leak and decreasing performance when looping over a Keras model's predict function while feeding the model with a tf.data.Dataset, but not when feeding it with a numpy array. Does anyone understand what is causing this and/or how to resolve the issue?

Minimal reproducible code snippet (copy/paste runnable):

import tensorflow as tf
import numpy as np
import time

SIZE = 5000

inp = tf.keras.layers.Input(shape=(SIZE,), dtype='float32')
x = tf.keras.layers.Dense(units=SIZE)(inp)