Limiting the number of records in an SQLite DB

Asked by 不思量自难忘° on 2020-12-17 03:17

What I'm trying to implement here is a condition wherein an SQLite database holds only the most recent 1000 records. I have timestamps with each record. One of the ineffici…

1 Answer
  • Answered 2020-12-17 04:01

    You can use the implicit rowid column for that.

    Assuming you don't also delete rows manually in other ways:

    DELETE FROM yourtable WHERE rowid < (last_row_id - 1000)
    

    You can obtain the last rowid using an API function or as max(rowid).

    If you don't need exactly 1000 records (e.g. you just want to clean up old records), it is not necessary to do this on each insert. Add a counter to your program and run the cleanup, for instance, once every 100 inserts.
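    The counter approach above can be sketched with Python's sqlite3 module; the table and column names ("log", "ts", "msg") and the insert() helper are illustrative, not from the original post:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (ts INTEGER, msg TEXT)")

KEEP = 1000          # records to keep
CLEANUP_EVERY = 100  # run the DELETE only once per 100 inserts
inserts = 0

def insert(ts, msg):
    global inserts
    cur = conn.execute("INSERT INTO log (ts, msg) VALUES (?, ?)", (ts, msg))
    inserts += 1
    if inserts % CLEANUP_EVERY == 0:
        # cur.lastrowid is the API equivalent of SELECT max(rowid)
        conn.execute("DELETE FROM log WHERE rowid <= ?",
                     (cur.lastrowid - KEEP,))

for i in range(2500):
    insert(i, "event %d" % i)

count = conn.execute("SELECT COUNT(*) FROM log").fetchone()[0]
# between cleanups the table may briefly hold up to KEEP + CLEANUP_EVERY - 1 rows
```

    Right after a cleanup pass (here, after insert 2500), exactly 1000 rows remain.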

    UPDATE:

    Anyway, you pay the performance cost either on each INSERT or on each SELECT, so the choice depends on which you have more of: INSERTs or SELECTs.

    If you don't have enough INSERTs for performance to matter, you can use the following trigger to keep no more than 1000 records:

    CREATE TRIGGER triggername AFTER INSERT ON tablename BEGIN
         DELETE FROM tablename WHERE rowid NOT IN
             (SELECT rowid FROM tablename ORDER BY timestamp DESC LIMIT 1000);
    END;
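    Here is a runnable sketch of that trigger (written with a NOT IN subquery, which keeps exactly the N newest timestamps); the table name "log" and a cap of 5 rows are illustrative, the post uses 1000:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (timestamp INTEGER, msg TEXT)")
conn.execute("""
    CREATE TRIGGER trim_log AFTER INSERT ON log BEGIN
        DELETE FROM log WHERE rowid NOT IN
            (SELECT rowid FROM log ORDER BY timestamp DESC LIMIT 5);
    END
""")

for i in range(20):
    conn.execute("INSERT INTO log VALUES (?, ?)", (i, "event"))

rows = conn.execute("SELECT timestamp FROM log ORDER BY timestamp").fetchall()
# only the 5 most recent timestamps survive: 15..19
```

    The AFTER INSERT trigger sees the newly inserted row, so the subquery always ranks it along with the existing rows.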
    

    Creating a unique index on the timestamp column is a good idea too (in case it isn't the PK already). Also note that SQLite supports only FOR EACH ROW triggers, so when you bulk-insert many records it is worth temporarily disabling the trigger.

    If there are too many INSERTs, there isn't much you can do on the database side. You can make the trigger fire less often by adding a condition such as WHEN NEW.rowid % 100 = 0. And with SELECTs, just use LIMIT 1000 (or create an appropriate view).
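    The WHEN clause variant can be sketched like this; the modulus (10) and cap (5) are scaled down for illustration, where the post suggests % 100 with a 1000-row cap:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (timestamp INTEGER)")
conn.execute("""
    CREATE TRIGGER trim_log AFTER INSERT ON log
    WHEN NEW.rowid % 10 = 0 BEGIN
        DELETE FROM log WHERE rowid NOT IN
            (SELECT rowid FROM log ORDER BY timestamp DESC LIMIT 5);
    END
""")

for i in range(23):
    conn.execute("INSERT INTO log VALUES (?)", (i,))

# the trigger only fires on every 10th rowid, so the table can
# temporarily grow past the cap between trigger invocations
n = conn.execute("SELECT COUNT(*) FROM log").fetchone()[0]
```

    The trade-off: between firings the table holds up to cap + modulus - 1 rows, so pair this with LIMIT in your SELECTs if the excess matters.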

    I can't predict how much faster that would be; the best way is simply to measure the performance gain in your particular case.
