How to lemmatize a list of sentences


Question


How can I lemmatize a list of sentences in Python?

from nltk.stem.wordnet import WordNetLemmatizer
a = ['i like cars', 'cats are the best']
lmtzr = WordNetLemmatizer()
lemmatized = [lmtzr.lemmatize(word) for word in a]
print(lemmatized)

This is what I've tried, but it gives me back the same sentences. Do I need to tokenize the words first for this to work properly?


Answer 1:


TL;DR:

pip3 install -U pywsd

Then:

>>> from pywsd.utils import lemmatize_sentence

>>> text = 'i like cars'
>>> lemmatize_sentence(text)
['i', 'like', 'car']
>>> lemmatize_sentence(text, keepWordPOS=True)
(['i', 'like', 'cars'], ['i', 'like', 'car'], ['n', 'v', 'n'])

>>> text = 'The cat likes cars'
>>> lemmatize_sentence(text, keepWordPOS=True)
(['The', 'cat', 'likes', 'cars'], ['the', 'cat', 'like', 'car'], [None, 'n', 'v', 'n'])

>>> text = 'The lazy brown fox jumps, and the cat likes cars.'
>>> lemmatize_sentence(text)
['the', 'lazy', 'brown', 'fox', 'jump', ',', 'and', 'the', 'cat', 'like', 'car', '.']
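
Since the question asks about a list of sentences, you can map lemmatize_sentence over the list. A minimal sketch (the output for the second sentence is what I would expect from the examples above, not verified):

from pywsd.utils import lemmatize_sentence

sents = ['i like cars', 'cats are the best']
lemmatized = [lemmatize_sentence(s) for s in sents]
print(lemmatized)
# expected: [['i', 'like', 'car'], ['cat', 'be', 'the', 'best']]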

Otherwise, take a look at what the lemmatize_sentence function in pywsd does:

  • Tokenizes the string
  • Runs the POS tagger and maps the tags to the WordNet POS tagset
  • Attempts to stem
  • Finally calls the lemmatizer with the POS tags and/or stems

See https://github.com/alvations/pywsd/blob/master/pywsd/utils.py#L129
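
For illustration, here is a rough plain-NLTK approximation of the tagging-and-mapping step (penn2wordnet is a hypothetical helper name, not pywsd's actual function; see the linked utils.py for the real implementation). It also explains the None in the keepWordPOS output above: Penn Treebank tags such as DT have no WordNet counterpart.

from nltk import pos_tag, word_tokenize
from nltk.corpus import wordnet

# Requires the 'punkt', 'averaged_perceptron_tagger' and 'wordnet' NLTK data.
def penn2wordnet(tag):
    # Map the first letter of a Penn Treebank tag to a WordNet POS constant.
    return {'J': wordnet.ADJ, 'V': wordnet.VERB,
            'N': wordnet.NOUN, 'R': wordnet.ADV}.get(tag[0])

tagged = pos_tag(word_tokenize('The cat likes cars'))
print(tagged)
# typically: [('The', 'DT'), ('cat', 'NN'), ('likes', 'VBZ'), ('cars', 'NNS')]
print([penn2wordnet(t) for _, t in tagged])
# [None, 'n', 'v', 'n']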




Answer 2:


You must lemmatize each word separately; as written, you are lemmatizing whole sentences. Corrected code fragment:

from nltk.stem.wordnet import WordNetLemmatizer
from nltk import word_tokenize
sents = ['i like cars', 'cats are the best']
lmtzr = WordNetLemmatizer()
lemmatized = [[lmtzr.lemmatize(word) for word in word_tokenize(s)]
              for s in sents]
print(lemmatized)
#[['i', 'like', 'car'], ['cat', 'are', 'the', 'best']]

You can also get better results if you first do POS tagging and then provide the POS information to the lemmatizer.
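
A minimal sketch of that approach (the Penn-to-WordNet mapping and the noun fallback below are my own simplification, not a canonical NLTK recipe):

from nltk import pos_tag, word_tokenize
from nltk.corpus import wordnet
from nltk.stem.wordnet import WordNetLemmatizer

def wordnet_pos(tag):
    # Crude Penn-to-WordNet mapping; fall back to noun, which is also
    # WordNetLemmatizer's default POS.
    return {'J': wordnet.ADJ, 'V': wordnet.VERB,
            'R': wordnet.ADV}.get(tag[0], wordnet.NOUN)

sents = ['i like cars', 'cats are the best']
lmtzr = WordNetLemmatizer()
lemmatized = [[lmtzr.lemmatize(word, wordnet_pos(tag))
               for word, tag in pos_tag(word_tokenize(s))]
              for s in sents]
print(lemmatized)
# expected: [['i', 'like', 'car'], ['cat', 'be', 'the', 'best']]
# 'are' now becomes 'be', which the POS-less call above left unchanged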



Source: https://stackoverflow.com/questions/50685343/how-to-lemmatize-a-list-of-sentences
