How does spacy use word embeddings for Named Entity Recognition (NER)?


Question


I'm trying to train an NER model using spaCy to identify locations, (person) names, and organisations. I'm trying to understand how spaCy recognises entities in text and I've not been able to find an answer. From this issue on GitHub and this example, it appears that spaCy uses a number of features such as POS tags, prefixes, suffixes, and other character- and word-based features of the text to train an Averaged Perceptron.

However, nowhere in the code does it appear that spaCy uses the GloVe embeddings (although each word in the sentence/document appears to have them, if present in the GloVe vocabulary).
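
For reference, this is easy to check directly on the tokens (a minimal sketch; it assumes the medium English model, en_core_web_md, which ships with GloVe vectors, is installed):

    import spacy

    nlp = spacy.load("en_core_web_md")
    doc = nlp("Apple is opening an office in London.")

    for token in doc:
        # has_vector is True when the token is found in the model's
        # vector table; vector_norm is the L2 norm of its embedding.
        print(token.text, token.has_vector, token.vector_norm)

    # The raw embedding of a single token (a numpy array):
    print(doc[0].vector.shape)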

My questions are:

  1. Are these used in the NER system now?
  2. If I were to switch out the word vectors to a different set, should I expect performance to change in a meaningful way?
  3. Where in the code can I find out how (if at all) spaCy is using the word vectors?

I've tried looking through the Cython code, but I'm not able to understand whether the labelling system uses word embeddings.


Answer 1:


spaCy does use word embeddings for its NER model, which is a multilayer CNN. There's quite a nice video that Matthew Honnibal, the creator of spaCy, made about how its NER works. All three English models use GloVe vectors trained on Common Crawl, but the smaller models "prune" the number of vectors by mapping similar words to the same vector.
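
As a rough illustration of that pruning step (a sketch, not the models' actual build process; the model name and row count are assumptions), spaCy exposes Vocab.prune_vectors, which keeps the most frequent rows and remaps every other word to its closest surviving vector:

    import spacy

    nlp = spacy.load("en_core_web_md")
    print("rows, dims before:", nlp.vocab.vectors.shape)

    # Keep only the 5000 most frequent vectors; every pruned word is
    # remapped to the nearest remaining vector, so lookups still work.
    remapped = nlp.vocab.prune_vectors(5000)
    print("rows, dims after:", nlp.vocab.vectors.shape)

    # remapped maps each pruned word to (the word whose vector it now
    # shares, the cosine similarity between the two).
    for word, (alias, sim) in list(remapped.items())[:5]:
        print(word, "->", alias, round(float(sim), 2))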

It's quite doable to add custom vectors. There's an overview of the process in the spaCy docs, plus some example code on GitHub.
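
A minimal sketch of what that process boils down to in code (the vector values and the word "gravel" are made up for illustration; in practice you would load real vectors from a word2vec/fastText-style file):

    import numpy
    import spacy

    nlp = spacy.load("en_core_web_sm")

    # A made-up 300-dimensional vector standing in for a real
    # pretrained embedding loaded from disk.
    vector = numpy.random.uniform(-1, 1, (300,)).astype("float32")
    nlp.vocab.set_vector("gravel", vector)

    assert nlp.vocab.has_vector("gravel")

For a whole vector file, the docs route this through the command line instead (spacy init-model in v2, spacy init vectors in v3), which packages the vectors into a loadable model directory.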



Source: https://stackoverflow.com/questions/44492430/how-does-spacy-use-word-embeddings-for-named-entity-recognition-ner
