Why does Lucene cause OOM when indexing large files?

一向 2021-01-13 05:01

I’m working with Lucene 2.4.0 on JDK 1.6.0_07. I consistently get `OutOfMemoryError: Java heap space` when trying to index large text files.

5 Answers
  •  庸人自扰
    2021-01-13 05:53

    You can configure the IndexWriter to flush based on memory usage or on the number of buffered documents. I would suggest setting it to flush based on memory and seeing if that fixes your issue. My guess is that your entire index is living in memory because you never flush it to disk.
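
    A minimal sketch of that configuration for Lucene 2.4, using `setRAMBufferSizeMB` to trigger flushes by memory rather than document count (the index path and buffer size are illustrative assumptions, not values from the question):

    ```java
    import java.io.File;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.FSDirectory;

    public class FlushByRam {
        public static void main(String[] args) throws Exception {
            // Open an on-disk index; the path here is just an example.
            IndexWriter writer = new IndexWriter(
                    FSDirectory.getDirectory(new File("/tmp/index")),
                    new StandardAnalyzer(),
                    true, // create a new index
                    IndexWriter.MaxFieldLength.UNLIMITED);

            // Flush buffered documents to disk once they use roughly 16 MB
            // of heap (16 is the Lucene default; tune to your heap size).
            writer.setRAMBufferSizeMB(16.0);
            // Disable the document-count trigger so memory use is the
            // only flush criterion.
            writer.setMaxBufferedDocs(IndexWriter.DISABLE_AUTO_FLUSH);

            // ... writer.addDocument(...) calls for your large files ...

            writer.close();
        }
    }
    ```

    With this setup the writer's RAM usage stays bounded regardless of how many documents you add, which is usually what you want when indexing very large inputs.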
