Will scikit-learn utilize GPU?

Submitted by a 夏天 on 2019-12-02 21:48:21

TensorFlow uses the GPU only if it is built against CUDA and cuDNN. By default, neither TensorFlow nor scikit-learn will use the GPU, especially when running inside Docker, unless you use nvidia-docker and an image capable of GPU access.
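As a quick check, here is a minimal sketch (assuming TensorFlow 2.x) to verify whether your TensorFlow build has CUDA support and whether any GPU is actually visible at runtime:

```python
# Minimal sketch (assumes TensorFlow 2.x): check whether this TensorFlow
# build was compiled with CUDA and whether any GPU is visible at runtime.
import tensorflow as tf

print("Built with CUDA:", tf.test.is_built_with_cuda())
print("Visible GPUs:", tf.config.list_physical_devices("GPU"))
```

If both report nothing, TensorFlow will silently fall back to the CPU.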

Scikit-learn is not intended to be used as a deep-learning framework, and it does not support GPU computation.
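To illustrate, a minimal sketch: scikit-learn estimators run entirely on the CPU, and any parallelism comes from CPU cores via the `n_jobs` parameter, not from a GPU.

```python
# Minimal sketch: scikit-learn runs on the CPU; n_jobs=-1 parallelizes
# across CPU cores, not GPU devices.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
clf = RandomForestClassifier(n_jobs=-1, random_state=0)  # CPU-core parallelism only
clf.fit(X, y)
print("Training accuracy:", clf.score(X, y))
```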

Why is there no support for deep or reinforcement learning / Will there be support for deep or reinforcement learning in scikit-learn?

Deep learning and reinforcement learning both require a rich vocabulary to define an architecture, with deep learning additionally requiring GPUs for efficient computing. However, neither of these fit within the design constraints of scikit-learn; as a result, deep learning and reinforcement learning are currently out of scope for what scikit-learn seeks to achieve.

Extracted from http://scikit-learn.org/stable/faq.html#why-is-there-no-support-for-deep-or-reinforcement-learning-will-there-be-support-for-deep-or-reinforcement-learning-in-scikit-learn

Will you add GPU support in scikit-learn?

No, or at least not in the near future. The main reason is that GPU support will introduce many software dependencies and introduce platform specific issues. scikit-learn is designed to be easy to install on a wide variety of platforms. Outside of neural networks, GPUs don’t play a large role in machine learning today, and much larger gains in speed can often be achieved by a careful choice of algorithms.

Extracted from http://scikit-learn.org/stable/faq.html#will-you-add-gpu-support
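To make the FAQ's last point concrete, here is a hedged sketch of how algorithm choice can dwarf hardware gains: on larger datasets, a linear SGDClassifier typically fits orders of magnitude faster than a kernel SVC. The dataset sizes and models below are illustrative choices, not a benchmark from the FAQ.

```python
# Sketch: compare fit times of a kernel SVC vs. a linear SGDClassifier
# on the same synthetic data, illustrating that choosing a cheaper
# algorithm often buys more speed than faster hardware would.
import time
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=10000, n_features=50, random_state=0)

for name, clf in [("SVC (kernel)", SVC()),
                  ("SGDClassifier (linear)", SGDClassifier(random_state=0))]:
    start = time.perf_counter()
    clf.fit(X, y)
    print(f"{name}: fit in {time.perf_counter() - start:.2f}s")
```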

Simply No.

Take a look at the FAQ entry "Will you add GPU support?" provided by scikit-learn (linked above); it gives a clear explanation of why not.
