Create an LSTM layer with Attention in Keras for multi-label text classification neural network

梦毁少年i 2020-12-20 02:28

Greetings dear members of the community. I am creating a neural network to predict a multi-label y. Specifically, the neural network takes 5 inputs (list of actors, plot sum

1 Answer
  • 2020-12-20 02:51

    Let me summarize the intent. You want to add attention to your code. Yours is a sequence classification task and not a seq2seq translation task. You don't really care much about how it is done, so you are OK with not debugging the error above; you just need a working piece of code. The main input here is the movie reviews, each consisting of 'n' words, to which you want to add attention.

    Assume you embed the reviews and pass them to an LSTM layer. Now you want to 'attend' to all the hidden states of the LSTM layer and then generate a classification (instead of just using the last hidden state of the encoder). So an attention layer needs to be inserted. A barebones implementation would look like this:

        import tensorflow as tf
        from tensorflow.keras import backend as K

        class peel_the_layer(tf.keras.layers.Layer):
            def __init__(self):
                ##Nothing special to be done here
                super(peel_the_layer, self).__init__()

            def build(self, input_shape):
                ##Define the shape of the weights and bias in this layer
                ##This is a 1-unit layer
                units = 1
                ##The last index of input_shape is the feature dimension of the
                ##previous RNN layer; the second-to-last index is the number of timesteps
                self.w = self.add_weight(name="att_weights", shape=(input_shape[-1], units), initializer="random_normal")  ##the name property avoids "RuntimeError: Unable to create link"
                self.b = self.add_weight(name="att_bias", shape=(input_shape[-2], units), initializer="zeros")
                super(peel_the_layer, self).build(input_shape)

            def call(self, x):
                ##x is the input tensor: every hidden state that needs to be attended to
                ##Below is the main processing done during training
                e = K.tanh(K.dot(x, self.w) + self.b)   ##attention scores, shape (batch, timesteps, 1)
                a = K.softmax(e, axis=1)                ##attention weights over the timesteps
                output = x * a                          ##weight each hidden state by its attention weight

                ##Return the outputs. 'a' is the set of attention weights;
                ##the second value is the attention-adjusted output state, i.e. the context vector
                return a, K.sum(output, axis=1)
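
    To sanity-check the layer in isolation, you can call it on a random tensor; the shapes below are purely illustrative:

        ##Quick shape check on dummy data: batch of 2, 50 timesteps, 64 LSTM units
        dummy = tf.random.normal((2, 50, 64))
        weights, ctx = peel_the_layer()(dummy)
        print(weights.shape, ctx.shape)   ##(2, 50, 1) and (2, 64)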
    

    Now call the above Attention layer after your LSTM and before your Dense output layer. Make sure the LSTM is created with return_sequences=True so that all of its hidden states are exposed to the attention layer.

        a, context = peel_the_layer()(lstm_out)
        ##context is the output that will be the input to your classification layer
        ##a is the set of attention weights, which you may want to route to a display
    
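    For context, here is a minimal end-to-end sketch of how the pieces could fit together for a multi-label head. The sizes vocab_size, max_len, embed_dim, and num_labels are placeholders for your own values, and the sigmoid/binary_crossentropy pairing is the usual choice for multi-label targets:

        ##Hypothetical sizes -- replace with your own
        vocab_size, max_len, embed_dim, num_labels = 20000, 200, 128, 10

        review_in = tf.keras.Input(shape=(max_len,), dtype="int32")
        x = tf.keras.layers.Embedding(vocab_size, embed_dim)(review_in)
        lstm_out = tf.keras.layers.LSTM(64, return_sequences=True)(x)   ##return_sequences=True is required
        a, context = peel_the_layer()(lstm_out)
        out = tf.keras.layers.Dense(num_labels, activation="sigmoid")(context)   ##sigmoid for multi-label

        model = tf.keras.Model(inputs=review_in, outputs=out)
        model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])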

    You can build on top of this, since you seem to want to use other features apart from the movie reviews to come up with the final prediction. Attention mainly applies to the reviews, and its benefits show up when the sequences are long; one way to combine the extra inputs is sketched below.
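    If you want the extra inputs (actors, etc.) in the mix, one common pattern is to concatenate them with the attention context before the classification layer. The actor_in input and its size below are purely illustrative:

        num_actor_features = 32   ##hypothetical size of the extra feature vector
        actor_in = tf.keras.Input(shape=(num_actor_features,))
        merged = tf.keras.layers.Concatenate()([context, actor_in])
        out = tf.keras.layers.Dense(num_labels, activation="sigmoid")(merged)
        model = tf.keras.Model(inputs=[review_in, actor_in], outputs=out)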

    For more specific details, please refer to https://towardsdatascience.com/create-your-own-custom-attention-layer-understand-all-flavours-2201b5e8be9e
