Resource u'tokenizers/punkt/english.pickle' not found

粉色の甜心    asked 2020-12-13 01:49

My Code:

import nltk.data
tokenizer = nltk.data.load('nltk:tokenizers/punkt/english.pickle')

ERROR Message:

[ec2-user@ip-         

17 Answers
    -上瘾入骨i    2020-12-13 02:20

    For me, none of the above worked, so I downloaded all the files by hand from http://www.nltk.org/nltk_data/ and placed them, also by hand, in a "tokenizers" folder inside the "nltk_data" directory. Not a pretty solution, but still a solution.
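    A minimal sketch of how loading the hand-placed files might look. The path /home/ec2-user/nltk_data is only an assumed example location; point it at whatever directory actually contains tokenizers/punkt/english.pickle after unpacking the downloads.

        import nltk.data

        # Assumed example location of the hand-made nltk_data directory;
        # adjust to the directory that really holds tokenizers/punkt/english.pickle.
        nltk.data.path.append('/home/ec2-user/nltk_data')

        # With that search path registered, the original load call should resolve.
        tokenizer = nltk.data.load('nltk:tokenizers/punkt/english.pickle')
        print(tokenizer.tokenize("Hello world. This is a test."))

    (When network access is available, nltk.download('punkt') fetches the same data automatically, which is presumably what the other answers suggested.)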
