CUDNN_STATUS_BAD_PARAM when trying to perform inference on a LSTM Seq2Seq with masked inputs
Question: I'm using Keras layers on TensorFlow 2.0 to build a simple LSTM-based Seq2Seq model for text generation. Versions I'm using: Python 3.6.9, TensorFlow 2.0.0, CUDA 10.0, CUDNN 7.6.1, Nvidia driver version 410.78. I'm aware of the criteria needed by TF to delegate to CUDNNLstm when a GPU is present (I do have a GPU and my model/data meet all these criteria). Training goes smoothly (with a warning message, see the end of this post) and I can verify that CUDNNLstm is being used. However, when I
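For reference, here is a minimal sketch of a masked LSTM Seq2Seq in tf.keras of the kind described above (not the asker's exact model; the vocabulary sizes and `latent_dim` are placeholder assumptions). The comments note the documented conditions under which TF 2.0 dispatches to the cuDNN kernel.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Placeholder dimensions (assumptions, not from the original post)
num_encoder_tokens = 5000
num_decoder_tokens = 5000
latent_dim = 256

# Encoder: mask_zero=True produces a mask; for the cuDNN kernel to be used,
# masked inputs must be strictly right-padded.
encoder_inputs = layers.Input(shape=(None,), dtype="int32")
enc_emb = layers.Embedding(num_encoder_tokens, latent_dim, mask_zero=True)(encoder_inputs)
# The LSTM defaults (activation='tanh', recurrent_activation='sigmoid',
# recurrent_dropout=0, unroll=False, use_bias=True) are the documented
# criteria for TF 2.0 to delegate to the cuDNN implementation on GPU.
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(enc_emb)

# Decoder, initialized with the encoder's final states
decoder_inputs = layers.Input(shape=(None,), dtype="int32")
dec_emb = layers.Embedding(num_decoder_tokens, latent_dim, mask_zero=True)(decoder_inputs)
decoder_outputs, _, _ = layers.LSTM(
    latent_dim, return_sequences=True, return_state=True
)(dec_emb, initial_state=[state_h, state_c])
outputs = layers.Dense(num_decoder_tokens, activation="softmax")(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```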