Using Stanford CoreNLP

Posted by 空扰寡人 on 2019-12-21 04:32:13

Question


I am trying to get started with Stanford CoreNLP. I used some code from the web to understand what is going on with the coreference tool. I tried running the project in Eclipse but keep encountering an out-of-memory exception. I tried increasing the heap size, but it makes no difference. Any ideas on why this keeps happening? Is this a code-specific problem? Any direction on using CoreNLP would be awesome.

EDIT - Code Added

import edu.stanford.nlp.dcoref.CorefChain;
import edu.stanford.nlp.dcoref.CorefCoreAnnotations;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;

import java.util.Map;
import java.util.Properties;

public class TestMain {

    public static void main(String[] args) {

        String text = "Viki is a smart boy. He knows a lot of things.";
        Annotation document = new Annotation(text);

        // dcoref requires the tokenize, ssplit, pos, lemma, ner, and parse annotators.
        Properties props = new Properties();
        props.put("annotators", "tokenize, ssplit, pos, lemma, ner, parse, dcoref");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
        pipeline.annotate(document);

        // Map from coreference chain id to the chain itself.
        Map<Integer, CorefChain> graph =
                document.get(CorefCoreAnnotations.CorefChainAnnotation.class);

        // Iterate over the entries directly so the Integer chain ids are used
        // as-is (looking them up with String keys would always return null).
        for (Map.Entry<Integer, CorefChain> entry : graph.entrySet()) {
            System.out.println(entry.getKey() + " " + entry.getValue());
        }
    }
}
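For reference, the parsing and coreference models load a lot of data, so the default JVM heap is usually far too small for this pipeline. Running the class from the command line with an explicit -Xmx flag is a quick way to check whether heap size is really the problem, independent of Eclipse. A minimal sketch, assuming the compiled class and the CoreNLP code and models jars are on the classpath (the jar names and the 3g figure are illustrative, not exact requirements):

    java -Xmx3g -cp .:stanford-corenlp.jar:stanford-corenlp-models.jar TestMain

(On Windows, use ; instead of : as the classpath separator.)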

Answer 1:


I ran into a similar problem when building a small application with Stanford CoreNLP in Eclipse.
Increasing Eclipse's own heap size will not solve your problem, because Eclipse's heap and the heap of the program it launches are configured separately.
From my searching, it was the Ant build tool's heap size that needed increasing, but I had no idea how to do that.
So I gave up on Eclipse and used NetBeans instead.

PS: You will eventually get an out-of-memory exception with the default settings in NetBeans too, but it is easily solved by adjusting the -Xmx setting on a per-application basis.
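For a standard NetBeans Java SE project, the per-application setting mentioned above typically lives under Project Properties -> Run -> VM Options; a value such as the following raises the maximum heap (the exact size is an assumption, adjust it to your machine and models):

    -Xmx2g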




Answer 2:


Fix for Eclipse: you can configure this in the Eclipse preferences as follows:

  1. Window -> Preferences (on Mac: Eclipse -> Preferences)
  2. Java -> Installed JREs
  3. Select the JRE and click Edit
  4. In the "Default VM arguments" field, type "-Xmx1024M" (or your memory preference; 1024M corresponds to 1 GB of heap)
  5. Click Finish or OK.
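To verify that the new default is actually reaching launched programs, a tiny standard-library class can print the maximum heap the JVM was granted; this sketch uses only java.lang.Runtime and does not need CoreNLP:

    // Prints the maximum heap size this JVM will attempt to use.
    public class HeapCheck {
        public static void main(String[] args) {
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
        }
    }

If it still prints a small default, the VM arguments are not being applied to the launched JVM.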



Answer 3:


I think you can define the heap size per run: right-click the project -> Run As -> Run Configurations..., then enter the -Xmx flag in the VM arguments field on the Arguments tab. I have tested it on a Mac and it works.
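For example, the following VM arguments (the values are illustrative) set a 512 MB starting heap and a 2 GB ceiling; for an out-of-memory error it is the -Xmx ceiling that matters, since it caps how far the heap can grow:

    -Xms512m -Xmx2g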



Source: https://stackoverflow.com/questions/8967544/using-stanford-corenlp
