How do I create a variable-length input LSTM in Keras?

悲&欢浪女 2020-12-04 09:08

I am trying to do some vanilla pattern recognition with an LSTM using Keras to predict the next element in a sequence.

My data look like this:

where

3 Answers
  •  猫巷女王i 2020-12-04 10:06

    I am not clear about the embedding procedure, but here is a way to implement a variable-length input LSTM regardless: simply do not specify the timespan dimension when building the LSTM.

    import numpy as np
    import keras.backend as K
    from keras.layers import LSTM, Input
    
    I = Input(shape=(None, 200))  # unknown timespan, fixed feature size of 200
    lstm = LSTM(20)
    f = K.function(inputs=[I], outputs=[lstm(I)])  # backend function that runs the LSTM on a single batch
    
    data1 = np.random.random(size=(1, 100, 200))  # batch_size = 1, timespan = 100
    print(f([data1])[0].shape)
    # (1, 20)
    
    data2 = np.random.random(size=(1, 314, 200))  # batch_size = 1, timespan = 314
    print(f([data2])[0].shape)
    # (1, 20)
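
    The function above only runs the LSTM forward on one sequence at a time. For training, the same trick works inside a full model, with the caveat that all sequences within a single numpy batch must share one timespan. Below is a minimal sketch of my own (not from the original answer), using hypothetical random data and the same old standalone keras API as above: it pads the sequences to a common length and masks the padded steps.

    import numpy as np
    from keras.models import Model
    from keras.layers import Input, Masking, LSTM, Dense
    from keras.preprocessing.sequence import pad_sequences
    
    inp = Input(shape=(None, 200))       # unknown timespan, 200 features per step
    x = Masking(mask_value=0.0)(inp)     # skip timesteps that are all zeros (the padding)
    x = LSTM(20)(x)
    out = Dense(200)(x)                  # predict the next element of the sequence
    model = Model(inp, out)
    model.compile(optimizer='adam', loss='mse')
    
    # two sequences of different lengths, padded to the longest one
    seqs = [np.random.random((100, 200)), np.random.random((314, 200))]
    targets = np.random.random((2, 200)) # hypothetical "next element" targets
    padded = pad_sequences(seqs, dtype='float32', padding='post', value=0.0)
    model.fit(padded, targets, epochs=1, batch_size=2)

    Masking only makes sense if the padding value never occurs as a real timestep in your data; alternatively, you can skip padding entirely and feed each batch with its own timespan.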
    
