How can I lemmatize a list of sentences in Python?
from nltk.stem.wordnet import WordNetLemmatizer
a = ['i like cars', 'cats are the best']
lmtzr = WordNetLemmatizer()
WordNetLemmatizer works on individual words, not whole sentences, so you must tokenize each sentence and lemmatize the words one by one. Corrected code:
from nltk.stem.wordnet import WordNetLemmatizer
from nltk import word_tokenize
sents = ['i like cars', 'cats are the best']
lmtzr = WordNetLemmatizer()
lemmatized = [[lmtzr.lemmatize(word) for word in word_tokenize(s)]
for s in sents]
print(lemmatized)
# [['i', 'like', 'car'], ['cat', 'are', 'the', 'best']]
You can often get better results by POS-tagging the tokens first and passing the tag to the lemmatizer, since `lemmatize()` treats every word as a noun by default (e.g. "are" is only lemmatized to "be" when tagged as a verb).