Using a pretrained gensim Word2vec embedding in Keras
Question: I have trained a word2vec model in gensim. In Keras, I want to use it to build the matrix of each sentence from that word embedding. Storing the matrices of all the sentences is very space and memory inefficient, so I want to create an embedding layer in Keras that does this, so it can be used in further layers (LSTM). Can you tell me in detail how to do this? PS: It is different from other questions because I am using gensim for the word2vec training instead of Keras.

Answer 1: Let's say you have the following data that
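Before the answer's own walkthrough, here is a minimal sketch of the general idea: copy the gensim vectors into a matrix and use it to initialize a frozen Keras Embedding layer, so the model turns integer word indices into word2vec vectors on the fly instead of storing every sentence matrix. It assumes gensim 4.x (vectors under model.wv, indices in key_to_index), tf.keras, and a hypothetical saved model file "word2vec.model"; adjust names and API calls to your versions.

```python
import numpy as np
from gensim.models import Word2Vec
from tensorflow.keras.layers import Embedding

w2v = Word2Vec.load("word2vec.model")   # hypothetical path to your trained gensim model

vocab_size = len(w2v.wv)                 # number of words in the word2vec vocabulary
embedding_dim = w2v.vector_size          # dimensionality of the trained vectors

# Build a matrix whose row (i + 1) holds the vector for the word with gensim index i.
# Row 0 is left as zeros and reserved for padding.
embedding_matrix = np.zeros((vocab_size + 1, embedding_dim))
for word, idx in w2v.wv.key_to_index.items():
    embedding_matrix[idx + 1] = w2v.wv[word]

# Frozen Embedding layer initialized with the pretrained matrix; it maps integer
# word indices to their word2vec vectors inside the model, so the per-sentence
# matrices never have to be materialized and stored up front.
embedding_layer = Embedding(
    input_dim=vocab_size + 1,
    output_dim=embedding_dim,
    weights=[embedding_matrix],   # in some Keras versions use embeddings_initializer=Constant(...) instead
    mask_zero=True,               # lets downstream LSTM layers ignore the padding index 0
    trainable=False,              # keep the pretrained vectors fixed
)
```

To use it, convert each sentence into a padded sequence of integer word indices with the same +1 shift, feed those sequences to the model, and stack further layers (e.g. LSTM) directly on top of `embedding_layer`.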