In MNIST LSTM examples, I don't understand what "hidden layer" means. Is it the imaginary layer formed when you represent an unrolled RNN over time?
I think the term "num_hidden" is confusing for TF users. It actually has nothing to do with the unrolled LSTM cells over time; it is simply the dimensionality into which each time-step input is transformed before being fed through the LSTM cell, i.e. the size of the LSTM's hidden state vector.
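A minimal sketch illustrating this, using tf.keras (the classic MNIST example uses the older rnn_cell API, but the meaning of num_hidden is the same): num_hidden only sets the size of the hidden state, while the number of unrolled steps comes from the sequence length of the input.

```python
import tensorflow as tf

num_hidden = 128    # size of the LSTM hidden state vector (and cell state)
time_steps = 28     # MNIST fed row by row: 28 time steps...
input_dim = 28      # ...of 28 pixels each

# A batch of 32 images, each treated as a sequence of 28 rows.
inputs = tf.random.normal([32, time_steps, input_dim])   # (batch, time, features)

lstm = tf.keras.layers.LSTM(num_hidden, return_sequences=True)
outputs = lstm(inputs)

# The last dimension is num_hidden, independent of time_steps:
print(outputs.shape)   # (32, 28, 128)
```

Changing time_steps changes how often the same cell is applied; changing num_hidden changes the width of the state it carries between those steps.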