Why is my GPU slower than CPU when training LSTM/RNN models?

爱一瞬间的悲伤 2020-12-29 02:52

My machine has the following specs:

CPU: Xeon E5-1620 v4

GPU: Titan X (Pascal)

Ubuntu 16.04

Nvidia driver 375.26

CUDA toolkit 8.0

4 Answers
  •  醉话见心
    2020-12-29 03:06

    If you use Keras, use CuDNNLSTM in place of LSTM, or CuDNNGRU in place of GRU. In my case (2× Tesla M60), I am seeing a 10x performance boost. By the way, I am using a batch size of 128, as suggested by @Alexey Golyshev.
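    For reference, a minimal sketch of the swap, assuming Keras 2.x with a CUDA-enabled TensorFlow backend; the layer sizes, input shape, and training data names here are illustrative placeholders, not from the original post:

    ```python
    from keras.models import Sequential
    from keras.layers import CuDNNLSTM, Dense

    model = Sequential([
        # CuDNNLSTM uses the fused cuDNN kernel; it runs only on GPU and
        # does not support recurrent_dropout or non-default activations.
        CuDNNLSTM(128, input_shape=(50, 32)),  # (timesteps, features) are placeholders
        Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy')

    # Larger batches keep the GPU saturated; batch_size=128 as suggested above.
    # model.fit(x_train, y_train, batch_size=128, epochs=5)
    ```

    The trade-off is flexibility: the plain LSTM layer falls back to a generic (and much slower) implementation but supports options like recurrent_dropout that the cuDNN-backed layer does not.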
