Why are word embeddings actually vectors?

臣服心动 2020-12-09 23:27

I am sorry for my naivety, but I don't understand why the word embeddings that result from the NN training process (word2vec) are actually vectors.

Embedding is the p

4 Answers
  •  旧巷少年郎
    2020-12-10 00:32

    Each word is mapped to a point in d-dimensional space (d is typically 300 or 600, though it doesn't have to be), so it is called a vector: each point in d-dimensional space is nothing but a vector in that space.

    These points have some nice properties: words with similar meanings tend to lie close to each other, where proximity is typically measured by the cosine distance (equivalently, cosine similarity) between two word vectors.
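
    As a minimal sketch of that idea (the vectors below are made-up toy values, not real word2vec output, and the dimension is shrunk to 5 for readability), cosine similarity between word vectors can be computed like this:

        import numpy as np

        # Toy 5-dimensional "word vectors" -- purely illustrative numbers,
        # not values produced by an actual trained word2vec model.
        cat = np.array([0.2, 0.9, 0.1, 0.4, 0.7])
        dog = np.array([0.3, 0.8, 0.2, 0.5, 0.6])
        car = np.array([0.9, 0.1, 0.8, 0.2, 0.1])

        def cosine_similarity(u, v):
            # Cosine of the angle between u and v: close to 1.0 means the
            # vectors point in nearly the same direction.
            return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

        print(cosine_similarity(cat, dog))  # high: similar words, similar directions
        print(cosine_similarity(cat, car))  # lower: dissimilar words, larger angle

    (Cosine distance is just 1 minus the cosine similarity, so "small distance" and "high similarity" describe the same relationship.)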
