Bi-LSTM Attention model in Keras
Question: I am trying to build an attention model with a Bi-LSTM using word embeddings. I came across How to add an attention mechanism in keras?, https://github.com/philipperemy/keras-attention-mechanism/blob/master/attention_lstm.py and https://github.com/keras-team/keras/issues/4962. However, I am confused about how to implement Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification. So,

_input = Input(shape=[max_length], dtype='int32')  # get the embedding layer
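For context, the attention described in that paper pools the Bi-LSTM hidden states H by computing M = tanh(H), α = softmax(wᵀM), and r = Hαᵀ, where w is a learned vector. Below is a minimal NumPy sketch of that pooling step only (the function name, shapes, and toy values are my own assumptions for illustration, not code from the paper or the linked repositories):

```python
import numpy as np

def attention_pool(H, w):
    """Attention pooling over Bi-LSTM outputs, per the relation-classification paper.

    H: (hidden_dim, seq_len) matrix of hidden states (one column per time step).
    w: (hidden_dim,) trainable attention vector.
    Returns (r, alpha): the pooled representation and the attention weights.
    """
    M = np.tanh(H)                                  # (hidden_dim, seq_len)
    scores = w @ M                                  # (seq_len,) unnormalized scores
    scores -= scores.max()                          # subtract max for numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()   # softmax over time steps
    r = H @ alpha                                   # (hidden_dim,) weighted sum of states
    return r, alpha

# Toy example: 4 hidden units, 5 time steps.
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 5))
w = rng.standard_normal(4)
r, alpha = attention_pool(H, w)
print(r.shape, alpha.shape)  # (4,) (5,); alpha sums to 1
```

In a Keras model, this would typically be wrapped in a custom layer (or built from `Dense`, `Activation('softmax')`, and a weighted-sum step) applied to the Bi-LSTM's full output sequence, with w learned during training rather than sampled randomly.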