Lucene / Hibernate Search Lock Exception

Posted by 雨燕双飞 on 2020-01-13 10:17:11

Question


I use Hibernate Search to index and full-text search items on a web application, without any problems so far!

From my pom.xml:

<hibernate.search.version>3.4.2.Final</hibernate.search.version>
<apache.lucene.version>3.6.2</apache.lucene.version>
<apache.solr.version>3.6.2</apache.solr.version>
<hibernate.version>3.6.9.Final</hibernate.version>

Now, before going to production, I tried to stress test the search feature of my web application using Apache JMeter. When testing with more than one thread, I receive tons of the following exception:

17:11:57,670 ERROR LogErrorHandler:82 - Exception occurred org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: SimpleFSLock@/opt/myapp/item_index/myapp.item.domain.Item/write.lock
Primary Failure:
    Entity myapp.item.domain.Item  Id 4  Work Type  org.hibernate.search.backend.DeleteLuceneWork
org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: SimpleFSLock@/opt/myapp/item_index/myapp.item.domain.Item/write.lock
    at org.apache.lucene.store.Lock.obtain(Lock.java:84)
    at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1098)
    at org.hibernate.search.backend.Workspace.createNewIndexWriter(Workspace.java:202)
    at org.hibernate.search.backend.Workspace.getIndexWriter(Workspace.java:180)
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:103)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
17:11:57,670 ERROR PerDPQueueProcessor:118 - Unexpected error in Lucene Backend: 
org.hibernate.search.SearchException: Unable to remove class myapp.item.domain.Item#4 from index.
    at org.hibernate.search.backend.impl.lucene.works.DeleteExtWorkDelegate.performWork(DeleteExtWorkDelegate.java:77)
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.NullPointerException
    at org.hibernate.search.backend.impl.lucene.works.DeleteExtWorkDelegate.performWork(DeleteExtWorkDelegate.java:72)
    ... 7 more

My index creation is done using the following method (called from a @Transactional method):

@Override
public void createInitialIndexFromDB() {
    // get session and set flush mode to manual to control the commit
    FullTextSession fullTextSession = getFullTextSession();
    fullTextSession.setFlushMode(FlushMode.MANUAL);
    // do not add any data to the object context
    fullTextSession.setCacheMode(CacheMode.IGNORE);

    addResultsToIndex(fullTextSession, FETCH_ITEMS_TO_INDEX);
    addResultsToIndex(fullTextSession, FETCH_DRAFTS_TO_INDEX);
    addResultsToIndex(fullTextSession, FETCH_RESERVATIONS_TO_INDEX);
    addResultsToIndex(fullTextSession, FETCH_SALES_TO_INDEX);

    fullTextSession.flushToIndexes();
    fullTextSession.clear();
}

private void addResultsToIndex(FullTextSession fullTextSession, String query) {
    ScrollableResults results = fullTextSession.createQuery(query).scroll(
            ScrollMode.FORWARD_ONLY);
    for (int index = 1; results.next(); index++) {
        fullTextSession.index(results.get(0));
        // flush to the indexes and clear the session in batches to keep memory bounded
        if (index % BATCH_SIZE == 0 || results.isLast()) {
            fullTextSession.flushToIndexes();
            fullTextSession.clear();
        }
    }
}

private FullTextSession getFullTextSession() {
    Session session = this.sessionFactory.getCurrentSession();
    return Search.getFullTextSession(session);
}
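
As an aside: instead of this manual scroll-and-flush loop, Hibernate Search 3.x also offers the MassIndexer API, which handles batching, cache modes and session clearing internally. A minimal sketch (restricting to Item and reusing BATCH_SIZE are illustrative choices, not code from the question):

// Sketch: rebuild the initial index with Hibernate Search's MassIndexer
// instead of the manual scroll/flush loop above (values illustrative).
FullTextSession fullTextSession = getFullTextSession();
try {
    fullTextSession.createIndexer(Item.class) // or createIndexer() for all indexed types
            .batchSizeToLoadObjects(BATCH_SIZE)
            .cacheMode(CacheMode.IGNORE)
            .startAndWait();                  // blocks until the rebuild completes
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
}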

After index creation all changes to my indexes are going through a custom FullTextIndexEventListener:

public final class HibernateItemEventListener extends
        FullTextIndexEventListener {

    private static final Logger log = LoggerFactory
            .getLogger(HibernateItemEventListener.class);

    public HibernateItemEventListener() {
        super(Installation.SINGLE_INSTANCE);
    }

    @Override
    public void onPostInsert(PostInsertEvent event) {

        log.debug("onPostInsert");
        if (!isIndexed(event.getEntity()))
            return;

        // Without these checks the elements are added twice to the index!
        if (event.getEntity() instanceof ItemReservation) 
            return;

        if (event.getEntity() instanceof ItemSale)
            return;

        super.onPostInsert(event);
    }

    @Override
    public void onPostUpdate(PostUpdateEvent event) {
        log.debug("onPostUpdate - Start");
        if (!isIndexed(event.getEntity()))
            return;

        Serializable id = event.getId();
        log.debug("onPostUpdate - Need update for id " + id);

        if (used) {
            boolean identifierRollbackEnabled = event.getSession().getFactory()
                    .getSettings().isIdentifierRollbackEnabled();
            final Object entity = event.getEntity();
            if (searchFactoryImplementor.getDocumentBuilderIndexedEntity(entity
                    .getClass()) != null
                    || searchFactoryImplementor
                            .getDocumentBuilderContainedEntity(entity
                                    .getClass()) != null) {

                // Remove item
                if (entity instanceof Item) {
                    Item item = (Item) entity;
                    if (item.getQuantity() < 1) {
                        processWork(entity, id, WorkType.PURGE, event,
                                identifierRollbackEnabled);
                        return;
                    }
                }

                // Remove reservation
                if (entity instanceof ItemReservation) {
                    ItemReservation ir = (ItemReservation) entity;
                    if (ir.getActive() < 1) {
                        processWork(entity, id, WorkType.PURGE, event,
                                identifierRollbackEnabled);
                        return;
                    }
                }

                // Update entity
                processWork(entity, id, WorkType.UPDATE, event,
                        identifierRollbackEnabled);
            } else {
                // Add entity
                processWork(entity, id, WorkType.ADD, event,
                        identifierRollbackEnabled);
            }
        }
    }

    @Override
    public void onPostDelete(PostDeleteEvent event) {
        log.debug("onPostDelete - Start");
        if (!isIndexed(event.getEntity()))
            return;
        log.debug("onPostDelete - Need delete for id " + event.getId());
        super.onPostDelete(event);
    }

    private boolean isIndexed(Object entity) {
        return entity instanceof Item || entity instanceof Draft
                || entity instanceof ItemReservation
                || entity instanceof ItemSale;
    }
}
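
For context, a custom listener like this replaces the FullTextIndexEventListener that Hibernate Search would otherwise register automatically. A minimal sketch of the wiring using the Hibernate 3.x event-listener API (this registration code is an assumption, it is not shown in the question):

// Sketch: register the custom listener for the relevant event types
// (Hibernate 3.x Configuration API; assumed wiring, not from the question).
Configuration cfg = new Configuration().configure();
HibernateItemEventListener listener = new HibernateItemEventListener();
cfg.getEventListeners().setPostInsertEventListeners(
        new PostInsertEventListener[] { listener });
cfg.getEventListeners().setPostUpdateEventListeners(
        new PostUpdateEventListener[] { listener });
cfg.getEventListeners().setPostDeleteEventListeners(
        new PostDeleteEventListener[] { listener });
SessionFactory sessionFactory = cfg.buildSessionFactory();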

The exception above doesn't affect the application itself (the search does work), but it sometimes leads to another, more critical exception (which I suppose is related to the lock problem):

17:11:58,866 ERROR LogErrorHandler:82 - Exception occurred java.io.FileNotFoundException: _iz.fdx
java.io.FileNotFoundException: _iz.fdx
    at org.apache.lucene.store.FSDirectory.fileLength(FSDirectory.java:284)
    at org.apache.lucene.index.SegmentInfo.sizeInBytes(SegmentInfo.java:303)
    at org.apache.lucene.index.LogMergePolicy.sizeBytes(LogMergePolicy.java:193)
    at org.apache.lucene.index.LogByteSizeMergePolicy.size(LogByteSizeMergePolicy.java:45)
    at org.apache.lucene.index.LogMergePolicy.useCompoundFile(LogMergePolicy.java:147)
    at org.apache.lucene.index.DocumentsWriter.flush(DocumentsWriter.java:593)
    at org.apache.lucene.index.IndexWriter.doFlush(IndexWriter.java:3587)
    at org.apache.lucene.index.IndexWriter.prepareCommit(IndexWriter.java:3376)
    at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:3485)
    at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3467)
    at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3451)
    at org.hibernate.search.backend.Workspace.commitIndexWriter(Workspace.java:220)
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:109)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
17:11:59,991 ERROR LogErrorHandler:82 - Exception occurred java.io.FileNotFoundException: /opt/myapp/item_index/myapp.item.domain.Item/_iz.cfs (No such file or directory)

Any hints?

EDIT: After adjusting the libraries and increasing JMeter's aggressiveness (more threads), the exception and the index corruption still appear:

12:08:11,163 ERROR LogErrorHandler:82 - Exception occurred java.io.FileNotFoundException: /opt/myapp/item_index/myapp.item.domain.Item/_8gy.cfs (No such file or directory)
Primary Failure:
    Entity myapp.item.domain.Item  Id 596  Work Type  org.hibernate.search.backend.DeleteLuceneWork
Subsequent failures:
    Entity myapp.item.domain.Item  Id 596  Work Type  org.hibernate.search.backend.AddLuceneWork
    Entity myapp.item.domain.Item  Id 734  Work Type  org.hibernate.search.backend.DeleteLuceneWork
    Entity myapp.item.domain.Item  Id 734  Work Type  org.hibernate.search.backend.AddLuceneWork
    Entity myapp.item.domain.Item  Id 599  Work Type  org.hibernate.search.backend.DeleteLuceneWork
    Entity myapp.item.domain.Item  Id 599  Work Type  org.hibernate.search.backend.AddLuceneWork
    Entity myapp.item.domain.Item  Id 735  Work Type  org.hibernate.search.backend.DeleteLuceneWork
    Entity myapp.item.domain.Item  Id 735  Work Type  org.hibernate.search.backend.AddLuceneWork
    Entity myapp.item.domain.Item  Id 598  Work Type  org.hibernate.search.backend.DeleteLuceneWork
    Entity myapp.item.domain.Item  Id 598  Work Type  org.hibernate.search.backend.AddLuceneWork
    Entity myapp.item.domain.Item  Id 720  Work Type  org.hibernate.search.backend.DeleteLuceneWork
    Entity myapp.item.domain.Item  Id 720  Work Type  org.hibernate.search.backend.AddLuceneWork

java.io.FileNotFoundException: /opt/myapp/item_index/myapp.item.domain.Item/_8gy.cfs (No such file or directory)
    at java.io.RandomAccessFile.open(Native Method)
    at java.io.RandomAccessFile.<init>(RandomAccessFile.java:216)
    at org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexInput$Descriptor.<init>(SimpleFSDirectory.java:69)
    at org.apache.lucene.store.SimpleFSDirectory$SimpleFSIndexInput.<init>(SimpleFSDirectory.java:90)
    at org.apache.lucene.store.NIOFSDirectory$NIOFSIndexInput.<init>(NIOFSDirectory.java:91)
    at org.apache.lucene.store.NIOFSDirectory.openInput(NIOFSDirectory.java:78)
    at org.apache.lucene.index.CompoundFileReader.<init>(CompoundFileReader.java:66)
    at org.apache.lucene.index.CompoundFileReader.<init>(CompoundFileReader.java:55)
    at org.apache.lucene.index.IndexWriter.getFieldInfos(IndexWriter.java:1193)
    at org.apache.lucene.index.IndexWriter.getCurrentFieldInfos(IndexWriter.java:1213)
    at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1149)
    at org.hibernate.search.backend.Workspace.createNewIndexWriter(Workspace.java:202)
    at org.hibernate.search.backend.Workspace.getIndexWriter(Workspace.java:180)
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:103)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
12:08:11,163 ERROR PerDPQueueProcessor:118 - Unexpected error in Lucene Backend: 
org.hibernate.search.SearchException: Unable to remove class myapp.item.domain.Item#596 from index.
    at org.hibernate.search.backend.impl.lucene.works.DeleteExtWorkDelegate.performWork(DeleteExtWorkDelegate.java:77)
    at org.hibernate.search.backend.impl.lucene.PerDPQueueProcessor.run(PerDPQueueProcessor.java:106)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.NullPointerException

EDIT: After debugging, the FileNotFoundException (and therefore the IndexWriter NPE) is generated at

IndexWriter writer = new IndexWriter( directoryProvider.getDirectory(), writerConfig );

in Workspace.createNewIndexWriter() method.

writerConfig:
matchVersion=LUCENE_31
analyzer=org.apache.lucene.analysis.SimpleAnalyzer
delPolicy=org.apache.lucene.index.KeepOnlyLastCommitDeletionPolicy
commit=null
openMode=CREATE_OR_APPEND
similarity=org.apache.lucene.search.DefaultSimilarity
termIndexInterval=128
mergeScheduler=org.hibernate.search.backend.impl.lucene.overrides.ConcurrentMergeScheduler
default WRITE_LOCK_TIMEOUT=1000
writeLockTimeout=1000
maxBufferedDeleteTerms=-1
ramBufferSizeMB=16.0
maxBufferedDocs=-1
mergedSegmentWarmer=null
mergePolicy=[LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=10, maxMergeSize=2147483648, maxMergeSizeForOptimize=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, useCompoundFile=true]
maxThreadStates=8
readerPooling=false
readerTermsIndexDivisor=1

Answer 1:


Hibernate Search is tightly coupled to Lucene's API, and the Lucene version you're using is quite different from the one used during the development of Hibernate Search 3.4.2.

These are the recommended versions, as read from the pom.xml of the 3.4.2.Final tag:

<hibernate.search.version>3.4.2.Final</hibernate.search.version>
<apache.lucene.version>3.1.0</apache.lucene.version>
<apache.solr.version>3.1.0</apache.solr.version>
<hibernate.version>3.6.10.Final</hibernate.version>
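
For completeness, a sketch of how these properties are typically referenced in the dependency section (coordinates as published on Maven Central; hibernate-search pulls lucene-core in transitively, so the explicit entry only pins the version):

<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-search</artifactId>
    <version>${hibernate.search.version}</version>
</dependency>
<!-- pin lucene-core to the version Hibernate Search 3.4.2 was built against -->
<dependency>
    <groupId>org.apache.lucene</groupId>
    <artifactId>lucene-core</artifactId>
    <version>${apache.lucene.version}</version>
</dependency>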



Answer 2:


From the Hibernate Search forum thread:

I think I solved it.

The problem was not Hibernate Search itself; it was Atomikos, which killed threads after the concurrent transaction threshold was exceeded (probably leaving file descriptors open). I increased the concurrent transaction limit (it was 50) and the problem was gone.
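
For reference, that limit corresponds to Atomikos's max_actives setting, whose default is 50; a minimal sketch of raising it in jta.properties (the value chosen here is illustrative):

# jta.properties: maximum number of concurrent transactions
# (the Atomikos default is 50; raised here purely as an illustration)
com.atomikos.icatch.max_actives = 200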

Despite this, I still don't understand why Hibernate Search opens an index writer each time I perform a Lucene query.



Source: https://stackoverflow.com/questions/18085526/lucene-hibernate-search-lock-exception
