What is num_units in tensorflow BasicLSTMCell?
In MNIST LSTM examples, I don't understand what "hidden layer" means. Is it the imaginary layer formed when you unroll the RNN over time? And why is num_units = 128 in most cases? I know I should read colah's blog in detail to understand this, but before that I just want to get some code working with some sample time-series data I have.

Answer (nobar):

num_units is the size of the LSTM's hidden state vector, i.e. the number of hidden units in each of the cell's gates. It largely determines the learning capacity of the network, since the number of learned parameters grows with it. The value 128 was likely selected arbitrarily or empirically; you can change it to suit your problem.
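To make the capacity point concrete, here is a small sketch (plain Python, no TensorFlow required) of how a standard LSTM's parameter count grows with num_units. It assumes the usual formulation with four gates, each holding an input weight matrix, a recurrent weight matrix, and a bias; the function name lstm_param_count is just an illustrative helper, not a TensorFlow API.

```python
def lstm_param_count(num_units, input_dim):
    """Parameter count of a standard LSTM layer.

    Each of the 4 gates (input, forget, cell, output) has:
      - an input weight matrix of shape (input_dim, num_units)
      - a recurrent weight matrix of shape (num_units, num_units)
      - a bias vector of length num_units
    """
    return 4 * (num_units * (input_dim + num_units) + num_units)

# MNIST rows fed as a sequence: input_dim = 28 pixels per step
for n in (32, 64, 128, 256):
    print(n, lstm_param_count(n, 28))
```

Doubling num_units roughly quadruples the parameter count (the recurrent term is quadratic in num_units), which is why the value is usually tuned empirically against validation performance rather than derived from first principles.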