How to visualize RNN/LSTM gradients in Keras/TensorFlow?
Question: I've come across research publications and Q&As discussing the need to inspect RNN gradients per backpropagation through time (BPTT) - i.e., the gradient at each timestep. The main use is introspection: how do we know if an RNN is learning long-term dependencies? That is a question of its own, but the most important insight is gradient flow: if a non-zero gradient flows through every timestep, then every timestep contributes to learning - i.e., the resultant gradients stem from accounting for
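One way to get per-timestep gradients in TF 2.x is to expose the recurrent layer's full output sequence (`return_sequences=True`) and backpropagate the loss to it with `tf.GradientTape`. The sketch below is a minimal, hypothetical setup (toy data, an LSTM with illustrative sizes, a made-up `grad_model` name), not a definitive recipe:

```python
import numpy as np
import tensorflow as tf

# Toy data: 8 sequences, 20 timesteps, 4 features (shapes are illustrative)
np.random.seed(0)
x = np.random.randn(8, 20, 4).astype("float32")
y = np.random.randn(8, 1).astype("float32")

inp = tf.keras.Input(shape=(20, 4))
seq = tf.keras.layers.LSTM(16, return_sequences=True)(inp)  # keep every timestep
out = tf.keras.layers.Dense(1)(seq[:, -1, :])               # predict from the last step
grad_model = tf.keras.Model(inp, [seq, out])

with tf.GradientTape() as tape:
    seq_out, pred = grad_model(x)
    loss = tf.reduce_mean(tf.square(pred - y))

# Gradient of the loss w.r.t. the LSTM output at every timestep:
# shape (batch, timesteps, units)
grads = tape.gradient(loss, seq_out)

# Collapse to one number per timestep (L2 norm over units, mean over batch),
# which can then be plotted, e.g. with matplotlib, to visualize gradient flow
per_step = tf.norm(grads, axis=-1)                 # (batch, timesteps)
mean_per_step = tf.reduce_mean(per_step, axis=0)   # (timesteps,)
print(mean_per_step.shape)
```

If `mean_per_step` decays to zero for early timesteps, gradients are vanishing and those steps contribute little to learning; a roughly flat curve suggests gradient flows through the whole sequence.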