How to calculate the number of parameters of an LSTM network?

Submitted by 孤人 on 2019-11-28 05:32:39

No, the number of parameters of an LSTM layer in Keras is:

params = 4 * ((size_of_input + 1) * size_of_output + size_of_output^2)

The additional 1 comes from the bias terms. Here n is the size of the input (increased by one for the bias term) and m is the size of the output of the LSTM layer, so the formula can also be written as 4 * ((n + 1) * m + m^2).

So finally, with an input of size 4096 and 256 units:

4 * (4097 * 256 + 256^2) = 4457472
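A minimal sketch of that formula as a Python helper (the function and argument names are just illustrative):

def lstm_param_count(size_of_input, size_of_output):
    # 4 gate blocks, each with an input kernel, a recurrent kernel and a bias
    return 4 * ((size_of_input + 1) * size_of_output + size_of_output ** 2)

print(lstm_param_count(4096, 256))  # 4457472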

(LSTM cell diagram omitted; image via the linked post)

num_params = [(num_units + input_dim + 1) * num_units] * 4

num_units + input_dim: concat [h(t-1), x(t)]

+ 1: bias

* 4: there are 4 neural-network layers (the yellow boxes in the cell diagram): {W_forget, W_input, W_output, W_cell}

model.add(LSTM(units=256, input_dim=4096, input_length=16))

[(256 + 4096 + 1) * 256] * 4 = 4457472

PS: num_units = num_hidden_units = output_dims
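If you want to double-check this against Keras itself, here is a quick sketch (assuming the current tf.keras API, where the input shape is given as (timesteps, features) instead of the older input_dim/input_length arguments used above):

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(16, 4096)),   # 16 timesteps, 4096 features per step
    tf.keras.layers.LSTM(units=256),
])

model.summary()              # the LSTM layer reports 4,457,472 parameters
print(model.count_params())  # 4457472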

Expanding the formula for @JohnStrong:

The 4 means we have separate weight and bias variables for the 3 gates (read / write / forget) and a 4th set for the cell state (within the same hidden state). These are shared across timesteps along a particular hidden state vector.

4 * lstm_hidden_state_size * (lstm_inputs_size + bias_variable + lstm_outputs_size) 

Since the LSTM output (y) is the hidden state h by design, without an extra projection layer the LSTM output size equals the hidden state size:

lstm_hidden_state_size = lstm_outputs_size 

let's call it d:

d = lstm_hidden_state_size = lstm_outputs_size 

Then

params = 4 * d * ((lstm_inputs_size + 1) + d) = 4 * ((lstm_inputs_size + 1) * d + d^2)
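The factor of 4 is also visible directly in the shapes of the weights Keras stores for the layer: the per-gate kernels are concatenated along the last axis, so each weight array has 4*d columns. A sketch (assuming tf.keras; the dummy input is only there to build the layer):

import numpy as np
import tensorflow as tf

layer = tf.keras.layers.LSTM(units=256)
layer(np.zeros((1, 16, 4096), dtype="float32"))  # build the weights

kernel, recurrent_kernel, bias = layer.get_weights()
print(kernel.shape)            # (4096, 1024) -> lstm_inputs_size x 4*d
print(recurrent_kernel.shape)  # (256, 1024)  -> d x 4*d
print(bias.shape)              # (1024,)      -> 4*d
print(kernel.size + recurrent_kernel.size + bias.size)  # 4457472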

LSTM Equations (via deeplearning.ai Coursera)
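(The equation image is not reproduced here. For reference, a reconstruction in that course's notation, assuming the standard LSTM formulation where a<t> is the hidden/output state and c<t> is the cell state:)

\Gamma_f = \sigma(W_f [a^{<t-1>}, x^{<t>}] + b_f)          % forget gate
\Gamma_u = \sigma(W_u [a^{<t-1>}, x^{<t>}] + b_u)          % update (input) gate
\tilde{c}^{<t>} = \tanh(W_c [a^{<t-1>}, x^{<t>}] + b_c)    % candidate cell state
\Gamma_o = \sigma(W_o [a^{<t-1>}, x^{<t>}] + b_o)          % output gate
c^{<t>} = \Gamma_u \odot \tilde{c}^{<t>} + \Gamma_f \odot c^{<t-1>}
a^{<t>} = \Gamma_o \odot \tanh(c^{<t>})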

  • It is evident from the equations that the final dimensions of all 6 equations are the same, and this dimension must necessarily equal the dimension of a(t).

  • Out of these 6 equations, only 4 contribute to the number of parameters, and from the equations it can be deduced that all 4 are symmetric. So if we find the number of parameters for 1 equation, we can just multiply it by 4 to get the total.

  • One important point to note is that the total number of parameters doesn't depend on the time steps (or input_length), since the same "W" and "b" are shared across all time steps.

  • Assume the inside of the LSTM cell has just one layer per gate (as in Keras).

  • Take equation 1 and let's relate it to the parameters. Let the number of neurons in the layer be n and the dimensionality of x be m (not counting the number of examples or time steps). Then the dimension of the forget gate is also n. As in an ordinary ANN, the dimension of "Wf" is n*(n+m) and the dimension of "bf" is n. So the number of parameters for one equation is [{n*(n+m)} + n], and the total number of parameters is 4*[{n*(n+m)} + n]. Opening the brackets gives 4*(nm + n^2 + n).

  • So, plugging in your values (n=256, m=4096), the total number of parameters is 4*((256*256) + (256*4096) + (256)) = 4*(1114368) = 4457472. A quick numeric check follows below.
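A quick numeric check of that expansion in Python, with the same n and m:

n, m = 256, 4096
print(4 * (n * m + n ** 2 + n))  # 4457472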
  • I think it would be easier to understand if we start with a simple RNN.

    Let's assume that we have 4 units (please ignore the ... in the original diagram and concentrate only on the visible units), and the input size (number of dimensions) is 3:

    The number of weights is 28 = 16 (num_units * num_units) for the recurrent connections + 12 (input_dim * num_units) for input. The number of biases is simply num_units.

    Recurrence means that each neuron's output is fed back into the whole network, so if we unroll it over the time sequence it looks like two dense layers:

    and that makes it clear why we have num_units * num_units weights for the recurrent part.

    The number of parameters for this simple RNN is 32 = 4 * 4 + 3 * 4 + 4, which can be expressed as num_units * num_units + input_dim * num_units + num_units or num_units * (num_units + input_dim + 1)

    Now, for an LSTM, we must multiply the number of these parameters by 4, as this is the number of parameter sets inside each unit, which was nicely illustrated in the answer by @FelixHo.
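    A small sketch to check both counts against Keras itself (assuming the tf.keras API: SimpleRNN with the sizes of the toy example above, and an LSTM with the same sizes):

    import tensorflow as tf

    rnn = tf.keras.Sequential([
        tf.keras.Input(shape=(None, 3)),      # any number of timesteps, 3 features
        tf.keras.layers.SimpleRNN(units=4),
    ])
    print(rnn.count_params())   # 32 = 4*4 + 3*4 + 4

    lstm = tf.keras.Sequential([
        tf.keras.Input(shape=(None, 3)),
        tf.keras.layers.LSTM(units=4),
    ])
    print(lstm.count_params())  # 128 = 4 * 32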
