Too many open files exception while indexing using Solr

Submitted by 我与影子孤独终老i on 2019-12-20 15:39:12

Question


I am using Solr for indexing documents in my web application, and solr.war is deployed on a JBoss server. While indexing, I get a "too many open files" exception. Part of the stack trace is shown below:

12:31:33,267 ERROR [STDERR] Exception in thread "Lucene Merge Thread #0"
12:31:33,267 ERROR [STDERR] org.apache.lucene.index.MergePolicy$MergeException: java.io.FileNotFoundException: /data/jbossesb/bin/solr/data/index/_2rw.prx (Too many open files)
12:31:33,267 ERROR [STDERR] at org.apache.lucene.index.ConcurrentMergeScheduler.handleMergeException(ConcurrentMergeScheduler.java:351)
12:31:33,267 ERROR [STDERR] at org.apache.lucene.index.ConcurrentMergeScheduler$MergeThread.run(ConcurrentMergeScheduler.java:315)
12:31:33,267 ERROR [STDERR] Caused by: java.io.FileNotFoundException: /data/jbossesb/bin/solr/data/index/_2rw.prx (Too many open files)
12:31:33,267 ERROR [STDERR] at java.io.RandomAccessFile.open(Native Method)

Answer 1:


A file descriptor limit is the most likely cause.

Check the limit your operating system has set and adjust it accordingly. On Unix, the command to view and set it is ulimit.
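A minimal sketch of checking and raising the limit, assuming a Linux shell and that JBoss is restarted from the same session so Solr inherits the new value; <jboss-pid> is a placeholder:

    # Show the current per-process limit on open file descriptors
    ulimit -n

    # Raise the soft limit for this shell session (the hard limit
    # must allow it); restart JBoss from this shell afterwards
    ulimit -n 8192

    # Count how many files the JBoss/Solr process currently holds open
    # (replace <jboss-pid> with the actual process id)
    lsof -p <jboss-pid> | wc -l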




Answer 2:


As explained in this Solr JIRA issue, you can try the following options:

  • increase your ulimit, e.g.: ulimit -n 1000000
  • set useCompoundFile to true in solrconfig.xml to use Lucene's compound file format (see the sketch after this list)
  • use a lower mergeFactor, which will result in fewer segments and hence fewer open files
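The last two options live in solrconfig.xml. A sketch only, assuming the <indexDefaults> section used by Solr 1.x (the version this question dates from) and illustrative values:

    <indexDefaults>
      <!-- Compound file format packs each segment into a single .cfs
           file, so far fewer descriptors stay open per segment -->
      <useCompoundFile>true</useCompoundFile>
      <!-- A lower mergeFactor triggers merges sooner, keeping fewer
           segments (and thus fewer files) on disk; the default is 10 -->
      <mergeFactor>4</mergeFactor>
    </indexDefaults>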



Answer 3:


Optimize the index. It probably has too many segments.
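One way to trigger an optimize is to post an <optimize/> command to the update handler; a sketch assuming the default single-core URL layout of Solr 1.x on localhost:8983:

    curl http://localhost:8983/solr/update \
         -H 'Content-Type: text/xml' \
         --data-binary '<optimize/>'

Optimizing merges the index down to a single segment, which immediately reduces the number of files Lucene keeps open.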




Answer 4:


Also try reducing the merge factor (mergeFactor, as shown under Answer 2).



Source: https://stackoverflow.com/questions/3828343/too-many-open-file-exception-while-indexin-using-solr
