Implementing Attention in Keras
Question: I am trying to implement attention in Keras on top of a simple LSTM:

```python
from keras.layers import Input, Dense, LSTM

model_2_input = Input(shape=(500,))
# model_2 = Conv1D(100, 10, activation='relu')(model_2_input)
model_2 = Dense(64, activation='sigmoid')(model_2_input)
model_2 = Dense(64, activation='sigmoid')(model_2)

model_1_input = Input(shape=(None, 2048))
# dropout / recurrent_dropout replace the old Keras 1 dropout_W / dropout_U arguments
model_1 = LSTM(64, dropout=0.2, recurrent_dropout=0.2, return_sequences=True)(model_1_input)
# return_state=True is required for the layer to also return state_h and state_c
model_1, state_h, state_c = LSTM(16, dropout=0.2, recurrent_dropout=0.2,
                                 return_sequences=True, return_state=True)(model_1)
```
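For reference, below is a minimal sketch of one way to attach attention to this setup, using the built-in `tf.keras.layers.Attention` layer (dot-product attention). It is not the asker's code: the choice of the dense branch as the query, the `Reshape` plumbing, and names like `query`, `seq`, and `attended` are all illustrative assumptions.

```python
# Minimal sketch: the dense branch produces a single query vector that
# attends over the LSTM's timestep outputs. All layer choices here are
# assumptions for illustration, not taken from the question.
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, LSTM, Attention, Reshape
from tensorflow.keras.models import Model

model_2_input = Input(shape=(500,))
model_2 = Dense(64, activation='sigmoid')(model_2_input)
model_2 = Dense(16, activation='sigmoid')(model_2)   # project to the LSTM width
query = Reshape((1, 16))(model_2)                    # (batch, 1, 16): one query step

model_1_input = Input(shape=(None, 2048))
seq = LSTM(64, dropout=0.2, recurrent_dropout=0.2, return_sequences=True)(model_1_input)
seq = LSTM(16, dropout=0.2, recurrent_dropout=0.2, return_sequences=True)(seq)

# Dot-product attention: query attends over the LSTM timesteps (keys = values = seq)
attended = Attention()([query, seq])                 # (batch, 1, 16)
attended = Reshape((16,))(attended)                  # drop the singleton time axis

output = Dense(1, activation='sigmoid')(attended)
model = Model(inputs=[model_1_input, model_2_input], outputs=output)
model.summary()
```

The key constraint is that the query and the keys must share the same feature dimension for the dot product, which is why the dense branch is projected to 16 units to match the second LSTM.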