Can SQLite handle 90 million records?

故里飘歌 2020-12-04 10:50

Or should I use a different hammer to fix this problem?

I've got a very simple use-case for storing data, effectively a sparse matrix, which I've attempted to store…

8 Answers
  •  轻奢々 (OP)
     2020-12-04 11:35

    I've looked at your code, and I think you might be overdoing it with the prepare and finalize statements. I am by no means an SQLite expert, but there's got to be significant overhead in preparing a statement each and every time through the loop.

    Quoting from the SQLite website:

    After a prepared statement has been evaluated by one or more calls to sqlite3_step(), it can be reset in order to be evaluated again by a call to sqlite3_reset(). Using sqlite3_reset() on an existing prepared statement rather than creating a new prepared statement avoids unnecessary calls to sqlite3_prepare(). In many SQL statements, the time needed to run sqlite3_prepare() equals or exceeds the time needed by sqlite3_step(). So avoiding calls to sqlite3_prepare() can result in a significant performance improvement.

    http://www.sqlite.org/cintro.html

    In your case, rather than preparing a new statement each time, you could try binding new values to your existing statement.
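
    To make that concrete, here is a minimal sketch of the prepare-once, bind/step/reset pattern. The table name (matrix) and column names (row, col, value) are my assumptions, since I haven't seen your schema; adjust them to match your actual code:

        /* Minimal sketch: prepare one statement, then bind/step/reset
           it inside the loop instead of re-preparing each time.
           Schema names are assumptions. */
        #include <stdio.h>
        #include <sqlite3.h>

        int insert_entries(sqlite3 *db, const int *rows, const int *cols,
                           const double *values, int n)
        {
            sqlite3_stmt *stmt = NULL;
            const char *sql =
                "INSERT INTO matrix (row, col, value) VALUES (?, ?, ?)";

            /* Prepare once, outside the loop. */
            if (sqlite3_prepare_v2(db, sql, -1, &stmt, NULL) != SQLITE_OK) {
                fprintf(stderr, "prepare failed: %s\n", sqlite3_errmsg(db));
                return 1;
            }

            for (int i = 0; i < n; i++) {
                /* Bind fresh values to the same statement... */
                sqlite3_bind_int(stmt, 1, rows[i]);
                sqlite3_bind_int(stmt, 2, cols[i]);
                sqlite3_bind_double(stmt, 3, values[i]);

                /* ...execute it... */
                if (sqlite3_step(stmt) != SQLITE_DONE) {
                    fprintf(stderr, "step failed: %s\n", sqlite3_errmsg(db));
                    sqlite3_finalize(stmt);
                    return 1;
                }

                /* ...and reset it for the next row, instead of
                   finalizing and preparing a new statement. */
                sqlite3_reset(stmt);
            }

            /* Finalize once, after the loop. */
            sqlite3_finalize(stmt);
            return 0;
        }

    One detail worth knowing: sqlite3_reset() does not clear the old bindings, so if a later row legitimately needs fewer bound values, call sqlite3_clear_bindings() after the reset as well.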

    All this said, I think the indexes might be the actual culprit, since the time keeps increasing as you add more data. I am curious enough about this that I plan to do some testing over the weekend.
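
    One cheap way to test that hypothesis (just a suggestion on my part, not something taken from your code) is to bulk-load into an unindexed table and build the index once at the end, so each insert doesn't also pay for an index update:

        /* Hypothetical test: create the table with no index, do the
           bulk load, then build the index in a single pass afterwards.
           Names are assumptions; error handling omitted for brevity. */
        sqlite3_exec(db,
            "CREATE TABLE matrix (row INTEGER, col INTEGER, value REAL)",
            NULL, NULL, NULL);
        /* ... run all the inserts here ... */
        sqlite3_exec(db,
            "CREATE INDEX idx_matrix ON matrix (row, col)",
            NULL, NULL, NULL);

    If the per-row insert time stops growing once the index is gone, that would point squarely at index maintenance as the bottleneck.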
