Does “tf.config.experimental.set_synchronous_execution” make the Python tensorflow lite interpreter use multiprocessing?
Question: I am using Python to do object detection in a video stream. I have a TensorFlow Lite model that takes a relatively long time to evaluate: with interpreter.invoke(), each evaluation takes about 500 ms. I'd like to use parallelism to get more evaluations per second. I see that I can call tf.config.experimental.set_synchronous_execution, and I was hoping that setting it would magically cause the interpreter to run in multiple processes. However, running help(tf.lite
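For reference, a minimal sketch of the kind of per-frame invoke() call described above; the model filename and the dummy input are placeholders, not taken from the original post:

```python
# Sketch of a single TFLite evaluation, assuming a hypothetical "detect.tflite"
# model and a single image-shaped input tensor.
import time
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="detect.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy frame matching the model's expected input shape and dtype.
frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])

start = time.perf_counter()
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()  # the single call that takes ~500 ms in the scenario above
detections = interpreter.get_tensor(output_details[0]["index"])
print(f"invoke() took {time.perf_counter() - start:.3f} s")
```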