Using Stanford CoreNLP


I ran into a similar problem when building a small application with Stanford CoreNLP in Eclipse.
Increasing Eclipse's own heap size will not solve the problem.
After some searching, it turned out that it is the Ant build tool's heap size that has to be increased, but I had no idea how to do that.
So I gave up on Eclipse and used NetBeans instead.

PS: You will eventually hit an OutOfMemoryError with the default settings in NetBeans as well, but it is easily solved by adjusting the -Xmx (maximum heap) setting on a per-application basis.
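To illustrate why the heap matters: it is loading the annotator models (especially ner and parse) that exhausts the default heap, not your own code. Below is a minimal sketch of such an application; the class name, annotator list, and sample sentence are only illustrative, not taken from the original question.

```java
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;

import java.util.Properties;

public class CoreNlpDemo {
    public static void main(String[] args) {
        // Loading these annotators pulls large models into memory,
        // which is what exceeds the JVM's default heap size.
        Properties props = new Properties();
        props.setProperty("annotators", "tokenize,ssplit,pos,lemma,ner,parse");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);

        Annotation document =
                new Annotation("Stanford CoreNLP needs a large heap to load its models.");
        pipeline.annotate(document);
        System.out.println("Annotation finished without running out of memory.");
    }
}
```

Launching it with something like `java -Xmx3g CoreNlpDemo` (or the equivalent VM argument in your IDE) is usually enough; the exact value depends on which annotators you load.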

user1374131

Fix for Eclipse: you can configure this in the Eclipse preferences as follows:

  1. Window -> Preferences (on Mac it's: Eclipse -> Preferences)
  2. Java -> Installed JREs
  3. Select the JRE and click Edit
  4. In the "Default VM arguments" field, type "-Xmx1024M" (or whatever maximum heap you prefer; 1024M is 1 GB)
  5. Click Finish, then OK. (A quick way to verify the new setting is shown after this list.)
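To confirm that the new default VM arguments are actually picked up, a small check like the following can be run from the same workspace (the class name is just illustrative):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Runtime.maxMemory() reports (approximately) the -Xmx limit
        // the JVM was started with, in bytes.
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap available to this JVM: " + maxHeapMb + " MB");
    }
}
```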

Alternatively, you can define the heap size for a single launch via right-click -> Run As -> Run Configurations..., under VM arguments. I have tested this on a Mac and it works.
