What does the vector of a word in word2vec represent?
Question: word2vec is an open-source tool by Google. For each word it provides a vector of float values; what exactly do they represent? There is also a paper on paragraph vectors. Can anyone explain how they use word2vec to obtain a fixed-length vector for a paragraph?

Answer 1: TL;DR: Word2Vec builds word projections (embeddings) in a latent space of N dimensions, N being the size of the word vectors obtained. The float values represent the coordinates of the words in this N-dimensional space.
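To make the "coordinates in a latent space" idea concrete, here is a minimal sketch. The vectors and the 4-dimensional space below are made up for illustration (real word2vec models typically use 100 to 300 dimensions, and the values come from training, not by hand); the point is only that semantic similarity corresponds to geometric closeness, usually measured with cosine similarity:

```python
import numpy as np

# Hypothetical 4-dimensional word vectors; the values are invented
# for illustration, not taken from a trained word2vec model.
vectors = {
    "king":  np.array([0.9, 0.1, 0.8, 0.2]),
    "queen": np.array([0.9, 0.1, 0.2, 0.8]),
    "apple": np.array([0.1, 0.9, 0.5, 0.5]),
}

def cosine(a, b):
    # Similarity of two words = cosine of the angle between
    # their coordinate vectors in the latent space.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# In a trained model, words used in similar contexts end up close
# together, so "king" should be nearer to "queen" than to "apple".
print(cosine(vectors["king"], vectors["queen"]))  # higher similarity
print(cosine(vectors["king"], vectors["apple"]))  # lower similarity
```

Each individual float has no standalone meaning; it is only the full vector, compared against other vectors, that encodes a word's usage.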