Why is my GPU slower than CPU when training LSTM/RNN models?

Asked by 爱一瞬间的悲伤 on 2020-12-29 02:52

My machine has the following specs:

CPU: Xeon E5-1620 v4

GPU: Titan X (Pascal)

Ubuntu 16.04

Nvidia driver 375.26

CUDA toolkit 8.0

4 Answers

Answered by 悲&欢浪女 on 2020-12-29 03:09

    Just a tip:

    Using a GPU pays off when

    1. your neural network model is large, and
    2. your batch size is large.

    With a small model or a small batch size, each training step launches many small GPU kernels and moves data between host and device, and that overhead can outweigh the compute saved. RNN/LSTM layers make this worse because their timesteps must run sequentially, leaving little work to parallelize per step. This is what I found from googling; the sketch below shows one way to measure the batch-size effect yourself.
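    Here is a minimal sketch (mine, not from the original post) that times one LSTM training step on CPU vs. GPU. It uses PyTorch since the question names no framework; the layer sizes, sequence length, and batch sizes are illustrative assumptions, not the asker's actual model.

    ```python
    # Hypothetical benchmark: average LSTM training-step time on CPU vs. GPU
    # for a few batch sizes. All model dimensions are illustrative guesses.
    import time
    import torch
    import torch.nn as nn

    def time_training_step(device, batch_size, hidden_size=512,
                           seq_len=100, input_size=128, n_steps=20):
        model = nn.LSTM(input_size, hidden_size, num_layers=2).to(device)
        optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
        # Input shape is (seq_len, batch, input_size), PyTorch's default layout.
        x = torch.randn(seq_len, batch_size, input_size, device=device)

        # One warm-up step so one-time CUDA setup cost is not counted.
        out, _ = model(x)
        out.sum().backward()
        optimizer.step()
        if device.type == "cuda":
            torch.cuda.synchronize()

        start = time.time()
        for _ in range(n_steps):
            optimizer.zero_grad()
            out, _ = model(x)
            out.sum().backward()  # dummy loss: sum of outputs
            optimizer.step()
        if device.type == "cuda":
            torch.cuda.synchronize()  # wait for queued GPU work before timing
        return (time.time() - start) / n_steps

    for bs in (1, 32, 256):
        cpu_t = time_training_step(torch.device("cpu"), bs)
        msg = f"batch={bs:4d}  cpu={cpu_t * 1000:7.1f} ms"
        if torch.cuda.is_available():
            gpu_t = time_training_step(torch.device("cuda"), bs)
            msg += f"  gpu={gpu_t * 1000:7.1f} ms"
        print(msg)
    ```

    On a setup like the asker's, you would typically see the GPU lose or roughly tie at batch size 1 and pull clearly ahead at 256, since larger batches give the GPU enough parallel work per timestep to amortize its overhead.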
