In MNIST LSTM examples, I don't understand what "hidden layer" means. Is it the imaginary layer formed when you represent the RNN unrolled over time?
Why is the <
Following @SangLe's answer, I made a picture (see the sources for the original images) showing cells as they are classically represented in tutorials (Source 1: Colah's blog) and an equivalent cell with 2 units (Source 2: Raimi Karim's post). I hope it clarifies the confusion between cells/units and what the network architecture really is.
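To make the same point in code, here is a minimal sketch (assuming TensorFlow/Keras; the dummy shapes and variable names are just for illustration) showing that `units` sets the width of the hidden state vector of a single LSTM cell, while the time dimension comes from unrolling that same cell over the sequence:

```python
# Minimal sketch, assuming TensorFlow/Keras; names and shapes are illustrative only.
import numpy as np
import tensorflow as tf

# In the usual MNIST LSTM setup, each 28x28 image is read as a sequence of
# 28 time steps, each step being one 28-pixel row.
dummy_images = np.random.rand(32, 28, 28).astype("float32")  # (batch, time steps, features)

# One LSTM layer whose cell has 2 units, matching the picture above:
# "units" is the size of the hidden state vector, NOT a number of layers.
lstm = tf.keras.layers.LSTM(units=2)
hidden_state = lstm(dummy_images)

print(hidden_state.shape)  # (32, 2): one 2-dimensional hidden vector per image
```

The unrolled diagrams in the tutorials show this same cell repeated once per time step; increasing `units` only widens the hidden state vector, it does not stack extra layers.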