lstm-stateful

Keras LSTM model - a TF 1.15 equivalent that works with TFLite

喜夏-厌秋, submitted 2021-02-11 12:44:49
Question TL;DR: How do I implement this model using tf.lite.experimental.nn.TFLiteLSTMCell and tf.lite.experimental.nn.dynamic_rnn instead of keras.layers.LSTM? I have this network in Keras:

inputs = keras.Input(shape=(1, 52))
state_1_h = keras.Input(shape=(200,))
state_1_c = keras.Input(shape=(200,))
x1, state_1_h_out, state_1_c_out = layers.LSTM(
    200, return_sequences=True, input_shape=(sequence_length, 52),
    return_state=True)(inputs, initial_state=[state_1_h, state_1_c])
output = layers.Dense(13)(x1)
model
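One possible TF 1.15 sketch of an equivalent graph, using the converter-friendly ops the question names. This is untested here (it requires TensorFlow 1.15); the variable names are illustrative, and the explicit state placeholders mirror the Keras initial_state inputs:

```python
import tensorflow as tf  # assumes TensorFlow 1.15

num_units, num_features, num_classes = 200, 52, 13
sequence_length = 1

# tf.lite.experimental.nn.dynamic_rnn expects time-major input:
# [max_time, batch_size, features]
inputs = tf.placeholder(
    tf.float32, [sequence_length, None, num_features], name='input')

# Explicit cell/hidden state inputs, mirroring the Keras initial_state list.
state_c = tf.placeholder(tf.float32, [None, num_units], name='state_c')
state_h = tf.placeholder(tf.float32, [None, num_units], name='state_h')
initial_state = tf.nn.rnn_cell.LSTMStateTuple(state_c, state_h)

# TFLiteLSTMCell is a drop-in replacement for tf.nn.rnn_cell.LSTMCell
# that the TFLite converter can fuse into a single LSTM op.
cell = tf.lite.experimental.nn.TFLiteLSTMCell(num_units)

outputs, final_state = tf.lite.experimental.nn.dynamic_rnn(
    cell, inputs, initial_state=initial_state,
    dtype=tf.float32, time_major=True)

# Dense projection to 13 classes, matching layers.Dense(13) in the question.
logits = tf.layers.dense(outputs, num_classes)
```

Note the layout difference: the Keras model is batch-major, while this graph is time-major, so inputs must be transposed accordingly before feeding.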

Keras Lstm predicting next item, taking whole sequences or sliding window. Will sliding window need stateful LSTM?

半城伤御伤魂, submitted 2020-12-13 03:27:07
Question: I have a sequence prediction problem in which, given the last n items in a sequence, I need to predict the next item. I have more than 2 million sequences, each with a different number of timesteps (sequence length): some are just 5, and some are 50/60/100/200, up to 500.

seq_inputs = [
    ["AA1", "BB3", "CC4", …, "DD5"],  # length/timesteps 5
    ["FF1", "DD3", "FF6", "KK8", "AA5", "CC8", …, "AA2"],  # length/timesteps 50
    ["AA2", "CC8", "CC11", "DD3", "FF6", "AA1", "BB3", …, "DD11"],  # length/timesteps 200
    ..
    ..
]
# there are
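On the stateful question: if each fixed-size window (left-padded for short sequences) is treated as an independent training sample, the model can stay stateless; state only needs to carry over between batches when one long sequence is split across consecutive batches in order. A minimal, framework-free sketch of the windowing itself (the helper name and PAD token are illustrative, not from the question):

```python
def sliding_windows(seq, n, pad="PAD"):
    """Yield (window, target) pairs: the last n items predict the next item.
    Sequences shorter than n + 1 are left-padded so they still yield a pair."""
    padded = [pad] * max(0, n + 1 - len(seq)) + list(seq)
    for i in range(len(padded) - n):
        yield padded[i:i + n], padded[i + n]

# A 5-item sequence with window size 3 yields two training pairs.
pairs = list(sliding_windows(["AA1", "BB3", "CC4", "DD5", "FF1"], 3))
# A 2-item sequence is padded on the left to produce one pair.
short = list(sliding_windows(["AA1", "BB3"], 3))
```

Because every pair is self-contained, the pairs can be shuffled freely across all 2 million sequences, which is exactly the situation where a stateless LSTM suffices.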
