Number of parameters for Keras SimpleRNN

礼貌的吻别 2020-12-11 05:53

I have a SimpleRNN like:

model = Sequential()
model.add(SimpleRNN(10, input_shape=(3, 1)))
model.add(Dense(1, activation="linear"))

The model summary reports 120 parameters for the SimpleRNN layer. How is this number calculated?


        
3 Answers
  • 2020-12-11 06:16

    When you look at the header of the table you see the column Param #:

    Layer (type)              Output Shape   Param #
    ================================================
    simple_rnn_1 (SimpleRNN)  (None, 10)     120
    

    This number represents the number of trainable parameters (weights and biases) in the respective layer, in this case your SimpleRNN.
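
    The table above is what Keras prints for the model in the question. As a rough sketch of how to reproduce it (assuming tf.keras; the import paths may differ for standalone Keras):

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import SimpleRNN, Dense

    model = Sequential()
    model.add(SimpleRNN(10, input_shape=(3, 1)))
    model.add(Dense(1, activation="linear"))
    model.summary()  # the SimpleRNN row shows 120 under Param #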

    Edit:

    The formula for calculating the weights is as follows:

    recurrent_weights + input_weights + biases

    resp. (num_features + num_units) * num_units + num_units

    Explanation:

    num_units = the number of units in the RNN

    num_features = the number of features of your input

    Now you have two things happening in your RNN.

    First, you have the recurrent loop, where the state is fed recurrently into the model to generate the next step. The weights for the recurrent step are:

    recurrent_weights = num_units*num_units

    Second, you have the new input from your sequence at each step:

    input_weights = num_features*num_units

    (Usually the last RNN state and the new input are concatenated and then multiplied by one single weight matrix; nevertheless, the input and the last RNN state use different weights within it.)

    So now we have the weights; what's missing are the biases - one bias for every unit:

    biases = num_units*1

    So finally we have the formula:

    recurrent_weights + input_weights + biases

    or

    num_units * num_units + num_features * num_units + biases

    =

    (num_features + num_units) * num_units + biases

    In your case this means the trainable parameters are:

    10*10 + 1*10 + 10 = 120
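
    As a quick sanity check, here is a minimal sketch (assuming tf.keras; the variable names are mine) that compares Keras's own count with the formula:

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import SimpleRNN

    num_units, num_features = 10, 1  # the question's setup
    model = Sequential()
    model.add(SimpleRNN(num_units, input_shape=(3, num_features)))

    # recurrent_weights + input_weights + biases
    expected = num_units * num_units + num_features * num_units + num_units
    print(model.count_params(), expected)  # 120 120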

    I hope this is understandable; if not, just tell me so I can edit it to make it clearer.

  • 2020-12-11 06:21

    I visualized the SimpleRNN you added; I think the figure can explain a lot.

    SimpleRNN layer (I'm a newbie here and can't post images directly, so you need to click the link).

    From the unrolled version of the SimpleRNN layer, it can be seen as a dense layer whose previous layer is the concatenation of the input and the current layer itself (from the previous step).

    So the number of parameters of the SimpleRNN can be computed just like for a dense layer:

    num_para = units_pre * units + num_bias

    where:

    units_pre is the sum of input neurons (1 in your settings) and units (see below),

    units is the number of neurons (10 in your settings) in the current layer,

    num_bias is the number of bias terms in the current layer, which is the same as units.

    Plugging in your settings, we get num_para = (1 + 10) * 10 + 10 = 120.
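
    To see this decomposition directly, a minimal sketch (assuming tf.keras) that prints the shapes of the layer's weight arrays:

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import SimpleRNN

    model = Sequential()
    model.add(SimpleRNN(10, input_shape=(3, 1)))

    # kernel (1, 10), recurrent_kernel (10, 10), bias (10,)
    for w in model.layers[0].get_weights():
        print(w.shape)
    print(sum(w.size for w in model.layers[0].get_weights()))  # 120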

  • 2020-12-11 06:27

    It might be easier to understand visually with a simple network like this (figure: a SimpleRNN with 4 units and 3 input dimensions):

    The number of weights is 16 (4 * 4) + 12 (3 * 4) = 28 and the number of biases is 4.

    where 4 is the number of units and 3 is the number of input dimensions. So the formula is just like in the first answer: num_units^2 + num_units * input_dim + num_units, or simply num_units * (num_units + input_dim + 1), which yields 10 * (10 + 1 + 1) = 120 for the parameters given in the question.
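
    A small sketch checking this 4-unit example as well (assuming tf.keras; the sequence length is arbitrary here, so None is used):

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import SimpleRNN

    units, input_dim = 4, 3
    model = Sequential()
    model.add(SimpleRNN(units, input_shape=(None, input_dim)))

    # units * (units + input_dim + 1) = 4 * (4 + 3 + 1) = 28 weights + 4 biases
    print(model.count_params())  # 32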
