Does Numpy automatically detect and use GPU?

Submitted by 坚强是说给别人听的谎言 on 2020-06-09 16:47:41

Question


I have a few basic questions about using Numpy with GPU (nvidia GTX 1080 Ti). I'm new to GPU, and would like to make sure I'm properly using the GPU to accelerate Numpy/Python. I searched on the internet for a while, but didn't find a simple tutorial that addressed my questions. I'd appreciate it if someone can give me some pointers:

1) Does Numpy/Python automatically detect the presence of a GPU and utilize it to speed up matrix computation (e.g. numpy.multiply, numpy.linalg.inv, ... etc)? Or do I have to code in a specific way to exploit the GPU for fast computation?

2) Can someone recommend a good tutorial or introductory material on using Numpy/Python with a GPU (nvidia's)?

Thanks a lot!


Answer 1:


Does Numpy/Python automatically detect the presence of a GPU and utilize it to speed up matrix computation (e.g. numpy.multiply, numpy.linalg.inv, ... etc)?

No.

Or do I have to code in a specific way to exploit the GPU for fast computation?

Yes. Look into Numba, Theano, PyTorch, or PyCUDA, which offer different paradigms for accelerating Python with GPUs.
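
As a rough sketch of what "coding in a specific way" looks like, here is one of the options above (PyTorch) used to run NumPy-style linear algebra on the GPU. The array sizes, the explicit device transfer, and the fallback to CPU are illustrative assumptions, not part of the original answer:

```python
# Minimal sketch: explicitly moving NumPy data to the GPU with PyTorch.
# Assumes PyTorch is installed with CUDA support; sizes are illustrative.
import numpy as np
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = np.random.rand(1000, 1000).astype(np.float32)
b = np.random.rand(1000, 1000).astype(np.float32)

# Nothing is moved to the GPU automatically; the transfer is explicit.
a_t = torch.from_numpy(a).to(device)
b_t = torch.from_numpy(b).to(device)

c_t = a_t @ b_t                  # matrix multiply on the selected device
inv_t = torch.linalg.inv(a_t)    # GPU-backed linear algebra when available

# Copy results back to host memory as ordinary NumPy arrays.
c = c_t.cpu().numpy()
print(c.shape, inv_t.shape)
```

The point is that the GPU work has to be requested explicitly: data is moved to the device, computed on, and copied back; plain NumPy calls never do this on their own.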




Answer 2:


No, it does not. You can also use CuPy, which has an interface very similar to NumPy's: https://cupy.chainer.org/
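
A minimal sketch of that drop-in style, assuming CuPy is installed with a matching CUDA toolkit (the array sizes are made up for illustration):

```python
# Minimal sketch: CuPy mirrors the NumPy API but executes on the GPU.
import numpy as np
import cupy as cp

a_cpu = np.random.rand(1000, 1000)
b_cpu = np.random.rand(1000, 1000)

# Transfer the arrays to GPU memory.
a_gpu = cp.asarray(a_cpu)
b_gpu = cp.asarray(b_cpu)

c_gpu = cp.matmul(a_gpu, b_gpu)   # same call names as numpy.matmul
inv_gpu = cp.linalg.inv(a_gpu)    # cp.linalg mirrors numpy.linalg

# Copy a result back to the host as a regular NumPy array.
c_cpu = cp.asnumpy(c_gpu)
print(c_cpu.shape, inv_gpu.shape)
```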




Answer 3:


JAX uses XLA to compile NumPy code to run on GPUs/TPUs: https://github.com/google/jax
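
A minimal sketch of that approach (the jitted function and array sizes are illustrative assumptions; JAX falls back to the CPU when no accelerator is present):

```python
# Minimal sketch: jax.numpy mirrors the NumPy API; XLA compiles the function
# and runs it on a GPU/TPU when one is available, otherwise on the CPU.
import numpy as np
import jax.numpy as jnp
from jax import jit

a = np.random.rand(1000, 1000)
b = np.random.rand(1000, 1000)

@jit  # compile with XLA
def solve(x, y):
    return jnp.linalg.inv(x) @ y

result = solve(jnp.asarray(a), jnp.asarray(b))
print(result.shape)
```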



Source: https://stackoverflow.com/questions/49605231/does-numpy-automatically-detect-and-use-gpu
