Question
I have created a Flask service that accepts requests with camera URLs as parameters and finds objects (tables, chairs, etc.) in the camera frame. I have written code in Flask for accepting POST requests.
@app.route('/rest/detectObjects', methods=['GET', 'POST'])
def detectObjects():
    ...
    json_result = function_call_for_detecting_objects()
    ...
    return json_result
Inside the function, the script loads the TensorFlow model for object detection and returns the result. A large number of requests need to be processed simultaneously by the Flask server, so I need to execute the function on the GPU, because camera access and image processing for object detection take a lot of time and CPU utilization. I have a 4 GB GeForce GTX 1050 Ti/PCIe/SSE2. How can I make my Python script use the GPU for this?
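The endpoint described above could be sketched as follows. This is a minimal illustration, not the asker's actual code: the model load and detection call are placeholders, and `camera_url` is an assumed parameter name. The one structural point it shows is loading the model once at startup, so each request pays only for inference rather than reloading the model.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)


def load_detection_model():
    # Placeholder for the real TensorFlow model load (an assumption,
    # not the asker's code). Loading once at import time avoids
    # repeating the expensive load on every request.
    return object()


MODEL = load_detection_model()


def function_call_for_detecting_objects(camera_url):
    # Placeholder for frame capture from `camera_url` plus detection;
    # the real version would run the loaded model on the frame.
    return {"camera": camera_url, "objects": ["table", "chair"]}


@app.route("/rest/detectObjects", methods=["GET", "POST"])
def detectObjects():
    camera_url = request.values.get("camera_url", "")
    json_result = function_call_for_detecting_objects(camera_url)
    return jsonify(json_result)
```

With this shape, GPU usage is decided by how the model itself is loaded and run, not by the Flask layer.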
Answer 1:
To utilize the GPU in Python you can use one of the available libraries: https://www.researchgate.net/post/How_do_I_run_a_python_code_in_the_GPU
CUDA might be the right choice for an NVIDIA GPU. For guidance on using it with the Anaconda Python distribution, see https://weeraman.com/put-that-gpu-to-good-use-with-python-e5a437168c01
Answer 2:
Installing tensorflow-gpu will make the script detect the GPU automatically. If it is not detecting the GPU, check the driver versions (CUDA and cuDNN). If there is no version mismatch and no errors occur, the script will identify the GPU and run on it.
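A quick way to check whether TensorFlow actually sees the GPU, as this answer suggests, is to list the physical devices. A minimal sketch, wrapped so it degrades gracefully when TensorFlow is not installed (`visible_gpus` is an illustrative helper name, not a library API):

```python
def visible_gpus():
    """Return the list of GPU devices TensorFlow can see,
    or None if TensorFlow is not installed."""
    try:
        import tensorflow as tf
    except ImportError:
        return None
    # An empty list here usually points to a CUDA/cuDNN
    # version mismatch or a missing driver.
    return tf.config.list_physical_devices("GPU")


if __name__ == "__main__":
    print("GPUs visible to TensorFlow:", visible_gpus())
```

If this prints an empty list on a machine with an NVIDIA card, the CUDA/cuDNN versions are the first thing to check.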
Source: https://stackoverflow.com/questions/53650577/how-to-run-python-code-with-support-of-gpu