StanfordNLP classifier out of memory error
Question: I'm using StanfordNLP to classify some text. It works fine with training files of up to 160K lines, but with larger files I get a java.lang.OutOfMemoryError: Java heap space. I'm using the following properties:

    e.s.n.c.ColumnDataClassifier - Setting ColumnDataClassifier properties
    e.s.n.c.ColumnDataClassifier - 1.useAllSplitWordTriples = true
    e.s.n.c.ColumnDataClassifier - useQN = true
    e.s.n.c.ColumnDataClassifier - encoding = utf-8
    e.s.n.c.ColumnDataClassifier
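For context, ColumnDataClassifier is typically run from the command line, and the heap ceiling that this error hits is set on that command line rather than in the properties file. A minimal sketch, assuming a properties file named classifier.prop and a 4 GB heap (both the file name and the heap size are illustrative, not taken from the question):

    # launch the classifier with an explicit maximum heap size
    # (-mx4g is the HotSpot shorthand for -Xmx4g);
    # classifier.prop and 4g are placeholder values
    java -mx4g -cp stanford-classifier.jar edu.stanford.nlp.classify.ColumnDataClassifier -prop classifier.prop

Note that a feature template like useAllSplitWordTriples generates a feature for every triple of split words in a line, so the feature space, and with it memory use, grows quickly as the training file gets larger.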