How can I use GPU for running a tflite model (*.tflite) using tf.lite.Interpreter (in python)?

Submitted by 大城市里の小女人 on 2021-02-10 07:25:36

Question


I have converted a TensorFlow inference graph to a TFLite model file (*.tflite), following the instructions at https://www.tensorflow.org/lite/convert.

I tested the TFLite model on my GPU server, which has four Nvidia TITAN GPUs, using tf.lite.Interpreter to load and run the model file.

It produces the same results as the original TensorFlow graph, but inference is far too slow. When I investigated, I found that GPU utilization stays at 0% while tf.lite.Interpreter is running.

Is there any way to run tf.lite.Interpreter with GPU support?
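For reference, a minimal sketch of the workflow described above, assuming TensorFlow 2.x. A tiny Keras model stands in for the real inference graph so the example is self-contained, and `model.tflite` is a placeholder path; the converted file is then loaded and run with tf.lite.Interpreter exactly as in the question.

```python
import numpy as np
import tensorflow as tf

# Stand-in for the real inference graph: a tiny Keras model,
# converted to a TFLite flatbuffer and written to disk.
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(4,))])
with open("model.tflite", "wb") as f:
    f.write(tf.lite.TFLiteConverter.from_keras_model(model).convert())

# Load and run the .tflite file with the Python interpreter.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

interpreter.set_tensor(inp["index"], np.zeros((1, 4), np.float32))
interpreter.invoke()
y = interpreter.get_tensor(out["index"])
print(y.shape)  # (1, 2)
```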


Answer 1:


See https://github.com/tensorflow/tensorflow/issues/34536.

The CPU is generally good enough for TFLite, especially a multicore one.

TFLite's GPU delegate targets mobile GPU platforms, so it likely will not make use of a desktop Nvidia GPU; the Python tf.lite.Interpreter runs on the CPU.
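Following the answer's point about multicore CPUs, a sketch of how to speed up CPU inference via the `num_threads` argument of tf.lite.Interpreter (available in TensorFlow 2.x). The tiny Keras model here is only a self-contained stand-in for a real converted graph.

```python
import numpy as np
import tensorflow as tf

# Self-contained stand-in model, converted to TFLite in memory.
model = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(8,))])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# num_threads lets the CPU kernels run on several cores.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes, num_threads=4)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

x = np.random.rand(1, 8).astype(np.float32)
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(out["index"])
print(y.shape)  # (1, 4)
```

For batch workloads, the speedup from `num_threads` depends on the operators in the model; measure before and after rather than assuming linear scaling.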



Source: https://stackoverflow.com/questions/57801680/how-can-i-use-gpu-for-running-a-tflite-model-tflite-using-tf-lite-interprete
