Using Stanford CoreNLP - Java heap space


It seems the problem wasn't Stanford CoreNLP or Java, but Eclipse. Here's what I tried:

  • Checking whether the arguments actually reach the VM by printing the result of Runtime.getRuntime().maxMemory() (a sketch of this check follows below).
  • Checking which arguments Eclipse actually passes by looking at the running processes and their command lines (ps -al).

It then turned out that Eclipse wasn't using the VM settings I had specified.
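A minimal sketch of that check, assuming an illustrative class name HeapCheck (Runtime.getRuntime().maxMemory() and the RuntimeMXBean are standard JDK APIs):

    import java.lang.management.ManagementFactory;

    // Prints the effective maximum heap size and the raw arguments the JVM was
    // started with, so you can see what Eclipse actually forwarded.
    public class HeapCheck {
        public static void main(String[] args) {
            System.out.println("maxMemory: " + Runtime.getRuntime().maxMemory());
            System.out.println("VM args:   "
                    + ManagementFactory.getRuntimeMXBean().getInputArguments());
        }
    }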

  • When I checked with Runtime.getRuntime().maxMemory(), the output was 530186240 (roughly 512 MB) instead of 3203792896 (roughly 3 GB).
  • The arguments used by Eclipse were in the wrong order, i.e. -Xms1g -Xmx3g -Xmx256m -Xmx512m instead of -Xmx256m -Xmx512m -Xms1g -Xmx3g. Since only the last occurrence of a duplicated flag takes effect, the -Xmx3g I had set was being overridden, which explains why it wasn't working (see the example below).
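For example (an assumed illustration using the hypothetical HeapCheck class from the sketch above; on HotSpot JVMs the right-most duplicate of a flag normally wins):

    java -Xmx3g -Xmx512m HeapCheck   # reports roughly 512 MB: -Xmx512m comes last
    java -Xmx512m -Xmx3g HeapCheck   # reports roughly 3 GB, because -Xmx3g comes last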
I then tried the following to fix it:

  • Don't override settings for a specific application, but rather override the default settings (see http://help.eclipse.org/juno/index.jsp?topic=%2Forg.eclipse.pde.doc.user%2Fguide%2Ftools%2Flaunchers%2Farguments.htm).
When that didn't work either, I tried to:

  • Manually edit the eclipse.ini file (add -vmargs -Xms1g -Xmx3g; see the sketch at the end of this answer).
When that didn't work either, I re-installed Eclipse. Now everything works again: I can set default settings and override them for specific applications.
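For reference, a rough sketch of the eclipse.ini edit mentioned above. The existing lines vary between Eclipse installations; the important points are that each argument goes on its own line and that -vmargs is the last launcher option, because everything after it is passed straight to the JVM:

    -vmargs
    -Xms1g
    -Xmx3g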
