Initializing LSTM hidden state in TensorFlow/Keras

盖世英雄少女心 2020-12-08 17:50

Can someone explain how I can initialize the hidden state of an LSTM in TensorFlow? I am trying to build an LSTM recurrent auto-encoder, so after I have that model trained I want to […]

4 Answers
  •  伪装坚强ぢ
    2020-12-08 18:31

    As stated in the Keras API documentation for recurrent layers (https://keras.io/layers/recurrent/):

    Note on specifying the initial state of RNNs

    You can specify the initial state of RNN layers symbolically by calling them with the keyword argument initial_state. The value of initial_state should be a tensor or list of tensors representing the initial state of the RNN layer.

    You can specify the initial state of RNN layers numerically by calling reset_states with the keyword argument states. The value of states should be a numpy array or list of numpy arrays representing the initial state of the RNN layer.

    Since the LSTM layer has two states (hidden state and cell state), the value of initial_state and states is a list of two tensors, in the order [hidden_state, cell_state].


    Examples

    Stateless LSTM

    Input shape: (batch, timesteps, features) = (1, 10, 1)
    Number of units in the LSTM layer = 8 (i.e. dimensionality of hidden and cell state)

    import tensorflow as tf
    import numpy as np
    
    inputs = np.random.random([1, 10, 1]).astype(np.float32)
    
    lstm = tf.keras.layers.LSTM(8)
    
    c_0 = tf.convert_to_tensor(np.random.random([1, 8]).astype(np.float32))
    h_0 = tf.convert_to_tensor(np.random.random([1, 8]).astype(np.float32))
    
    outputs = lstm(inputs, initial_state=[h_0, c_0])
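
    As a side note, omitting initial_state is equivalent to starting from all-zero states, so you can sanity-check the mechanism like this (a minimal sketch, assuming TF 2.x):

    ```python
    import numpy as np
    import tensorflow as tf

    inputs = np.random.random([1, 10, 1]).astype(np.float32)
    lstm = tf.keras.layers.LSTM(8)

    zeros = tf.zeros([1, 8])
    out_default = lstm(inputs)                                 # implicit zero initial state
    out_explicit = lstm(inputs, initial_state=[zeros, zeros])  # explicit zero initial state

    print(np.allclose(out_default, out_explicit))
    ```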
    

    Stateful LSTM

    Input shape: (batch, timesteps, features) = (1, 10, 1)
    Number of units in the LSTM layer = 8 (i.e. dimensionality of hidden and cell state)

    Note that for a stateful LSTM you also need to fix the batch size, e.g. via batch_input_shape.

    import tensorflow as tf
    import numpy as np
    from pprint import pprint
    
    inputs = np.random.random([1, 10, 1]).astype(np.float32)
    
    lstm = tf.keras.layers.LSTM(8, stateful=True, batch_input_shape=(1, 10, 1))
    
    c_0 = tf.convert_to_tensor(np.random.random([1, 8]).astype(np.float32))
    h_0 = tf.convert_to_tensor(np.random.random([1, 8]).astype(np.float32))
    
    outputs = lstm(inputs, initial_state=[h_0, c_0])
    

    With a stateful LSTM, the states are not reset at the end of each sequence, and we can notice that the output of the layer corresponds to the hidden state (i.e. lstm.states[0]) at the last timestep:

    >>> pprint(outputs)
    <tf.Tensor: shape=(1, 8), dtype=float32, numpy=...>
    >>>
    >>> pprint(lstm.states)
    [<tf.Variable ... shape=(1, 8) dtype=float32, numpy=...>,
     <tf.Variable ... shape=(1, 8) dtype=float32, numpy=...>]
    

    Calling reset_states() resets both states to zeros:

    >>> lstm.reset_states()
    >>> pprint(lstm.states)
    [<tf.Variable ... shape=(1, 8) dtype=float32, numpy=array([[0., 0., 0., 0., 0., 0., 0., 0.]], dtype=float32)>,
     <tf.Variable ... shape=(1, 8) dtype=float32, numpy=array([[0., 0., 0., 0., 0., 0., 0., 0.]], dtype=float32)>]
    >>>
    

    or to set them to a specific value:

    >>> lstm.reset_states(states=[h_0, c_0])
    >>> pprint(lstm.states)
    [<tf.Variable ... shape=(1, 8) dtype=float32, numpy=...>,
     <tf.Variable ... shape=(1, 8) dtype=float32, numpy=...>]
    >>>
    >>> pprint(h_0)
    <tf.Tensor: shape=(1, 8), dtype=float32, numpy=...>
    >>>
    >>> pprint(c_0)
    <tf.Tensor: shape=(1, 8), dtype=float32, numpy=...>
    >>>
    
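
    For the auto-encoder use case in the question, the symbolic initial_state argument is typically how an encoder's final states seed a decoder. A minimal sketch of that wiring (the encoder/decoder layout, latent size, and variable names here are illustrative assumptions, not part of the original answer):

    ```python
    import numpy as np
    import tensorflow as tf

    timesteps, features, latent = 10, 1, 8

    # Encoder: return its final hidden and cell state.
    enc_in = tf.keras.Input(shape=(timesteps, features))
    _, h, c = tf.keras.layers.LSTM(latent, return_state=True)(enc_in)

    # Decoder: start from the encoder's final states via initial_state.
    dec_in = tf.keras.Input(shape=(timesteps, features))
    dec_out = tf.keras.layers.LSTM(latent, return_sequences=True)(
        dec_in, initial_state=[h, c])

    model = tf.keras.Model([enc_in, dec_in], dec_out)

    x = np.random.random([1, timesteps, features]).astype(np.float32)
    print(model([x, x]).shape)  # (1, 10, 8)
    ```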
