R's coreNLP::initCoreNLP() throws java.lang.OutOfMemoryError


Question


coreNLP is an R package for interfacing with Stanford's CoreNLP Java libraries. The first call one must make (after loading the appropriate packages with library()) is initCoreNLP(). Unfortunately, this results in the following error:

Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... Error in rJava::.jnew("edu.stanford.nlp.pipeline.StanfordCoreNLP", basename(path)) : java.lang.OutOfMemoryError: GC overhead limit exceeded
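
For reference, here is a minimal sketch of the sequence that triggers the error (assuming the Java libraries were already fetched with the package's one-time downloadCoreNLP() setup step):

library(coreNLP)
# downloadCoreNLP()   # one-time download of the CoreNLP Java libraries, if not already done
initCoreNLP()         # fails here with java.lang.OutOfMemoryError: GC overhead limit exceeded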

Note: this is the same problem listed here: (initCoreNLP() method call from the Stanford's R coreNLP package throws error). In that case, however, the OP found that rebooting his machine made the problem disappear. That is not the case for me; the error persists even after a reboot.

Has anyone else run into this and can provide a solution or suggestion?

Thanks in advance, DG

CONFIG DETAILS:

R version 3.2.3 (2015-12-10)

rJava version 0.9-7

coreNLP version 0.4-1

Machine: Windows 7 with 8 GB RAM


Answer 1:


Here is some documentation I found:

https://cran.r-project.org/web/packages/coreNLP/coreNLP.pdf

(specifically page 7)

You can specify how much memory to use (from the documentation):

initCoreNLP(libLoc, parameterFile, mem = "4g", annotators)

Add more memory and I would imagine the problem will go away.
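
For example (a minimal sketch; the "6g" value is only an illustration — pick something that fits within your machine's 8 GB of RAM, and pass libLoc explicitly if your installation needs it):

library(coreNLP)
initCoreNLP(mem = "6g")   # raise the JVM heap above the documented "4g" default

Since the JVM's maximum heap is fixed once the JVM has started, it is worth restarting the R session before retrying with a larger mem value, so the new setting actually takes effect.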



Source: https://stackoverflow.com/questions/34983149/rs-corenlpinitcorenlp-throws-java-lang-outofmemoryerror
