Tensorflow: Word2vec CBOW model

余生分开走 2020-12-30 07:25

I am new to TensorFlow and to word2vec. I just studied word2vec_basic.py, which trains the model with the Skip-Gram algorithm. Now I want to train using the CBOW model. Is it true that this can be achieved simply by flipping train_inputs and train_labels?

3 Answers
  •  旧时难觅i
    2020-12-30 07:54

    I don't think the CBOW model can be achieved simply by flipping train_inputs and train_labels in the Skip-Gram code, because the CBOW architecture uses the sum of the vectors of the surrounding words as one single instance for the classifier to predict the center word. E.g., you should use [the, brown] together to predict quick, rather than using the to predict quick.
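    To make that concrete, here is a minimal sketch of a CBOW-style batch generator written against the `data` list of word indices and the `data_index` counter from word2vec_basic.py. The name generate_batch_cbow and its exact signature are my own choices for illustration, not part of the original script.

    ```python
    import collections
    import numpy as np

    data_index = 0

    def generate_batch_cbow(data, batch_size, skip_window):
        """Return (batch, labels): each row of `batch` holds the indices of the
        2*skip_window surrounding words, and `labels` holds the center word."""
        global data_index
        span = 2 * skip_window + 1  # [skip_window context] target [skip_window context]
        batch = np.ndarray(shape=(batch_size, 2 * skip_window), dtype=np.int32)
        labels = np.ndarray(shape=(batch_size, 1), dtype=np.int32)
        buffer = collections.deque(maxlen=span)
        # Fill the sliding window.
        for _ in range(span):
            buffer.append(data[data_index])
            data_index = (data_index + 1) % len(data)
        for i in range(batch_size):
            # Context = everything in the window except the center word.
            batch[i, :] = [buffer[j] for j in range(span) if j != skip_window]
            labels[i, 0] = buffer[skip_window]
            # Slide the window forward by one word.
            buffer.append(data[data_index])
            data_index = (data_index + 1) % len(data)
        return batch, labels
    ```

    With skip_window=1 and the text "the quick brown ...", the first row of batch would be [the, brown] and the first label would be quick, which is exactly the instance described above.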

    To implement CBOW, you'll have to write a new generate_batch generator function and sum up the vectors of the surrounding words before applying the logistic regression. I wrote an example you can refer to: https://github.com/wangz10/tensorflow-playground/blob/master/word2vec.py#L105
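    Here is a rough sketch of the corresponding graph-side change, in the TF1 style used by word2vec_basic.py (under TensorFlow 2 these ops live in tf.compat.v1): look up all the context embeddings and reduce them to a single vector per example before feeding the NCE loss. The hyperparameter values below are placeholders borrowed from that script, not requirements.

    ```python
    import math
    import tensorflow as tf

    vocabulary_size = 50000
    embedding_size = 128
    batch_size = 128
    skip_window = 1
    num_sampled = 64

    # Each training example is now 2*skip_window context word ids.
    train_inputs = tf.placeholder(tf.int32, shape=[batch_size, 2 * skip_window])
    train_labels = tf.placeholder(tf.int32, shape=[batch_size, 1])

    embeddings = tf.Variable(
        tf.random_uniform([vocabulary_size, embedding_size], -1.0, 1.0))
    # Shape: [batch_size, 2*skip_window, embedding_size]
    context_embeds = tf.nn.embedding_lookup(embeddings, train_inputs)
    # Sum the context vectors into one vector per example (CBOW).
    cbow_embed = tf.reduce_sum(context_embeds, axis=1)

    nce_weights = tf.Variable(
        tf.truncated_normal([vocabulary_size, embedding_size],
                            stddev=1.0 / math.sqrt(embedding_size)))
    nce_biases = tf.Variable(tf.zeros([vocabulary_size]))

    loss = tf.reduce_mean(
        tf.nn.nce_loss(weights=nce_weights,
                       biases=nce_biases,
                       labels=train_labels,
                       inputs=cbow_embed,
                       num_sampled=num_sampled,
                       num_classes=vocabulary_size))
    optimizer = tf.train.GradientDescentOptimizer(1.0).minimize(loss)
    ```

    The rest of the training loop from word2vec_basic.py stays the same; you just feed the (batch, labels) pairs produced by the CBOW generator instead of the Skip-Gram ones.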
