random-access

Compression formats with good support for random access within archives?

家住魔仙堡 · Submitted on 2019-11-26 07:20:59
Question: This is similar to a previous question, but the answers there don't satisfy my needs and my question is slightly different: I currently use gzip compression for some very large files which contain sorted data. When the files are not compressed, binary search is a handy and efficient way to seek to a location in the sorted data. But when the files are compressed, things get tricky. I recently found out about zlib's Z_FULL_FLUSH option, which can be used during compression to …
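The idea behind Z_FULL_FLUSH is that a full flush resets the compressor's dictionary, so decompression can restart cleanly at any flush boundary without seeing earlier data. A minimal sketch of that technique using Python's zlib module (raw deflate via negative wbits, so no stream header gets in the way; the chunk contents here are just placeholders):

```python
import zlib

# Compress two chunks with a full flush between them. Raw deflate
# (wbits = -15) means each full-flush boundary is a clean restart point.
co = zlib.compressobj(6, zlib.DEFLATED, -15)
part1 = co.compress(b"aaaa" * 1000) + co.flush(zlib.Z_FULL_FLUSH)
part2 = co.compress(b"bbbb" * 1000) + co.flush(zlib.Z_FINISH)

# Decompress the second chunk on its own, as if we had seeked straight
# to the flush boundary inside the archive.
do = zlib.decompressobj(-15)
restored = do.decompress(part2) + do.flush()
assert restored == b"bbbb" * 1000
```

In a real archive you would record the byte offset of each flush point in a small side index, then binary-search the index and inflate only from the nearest preceding flush boundary.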

SQLite - ORDER BY RAND()

陌路散爱 · Submitted on 2019-11-26 06:34:22
Question: In MySQL I can use the RAND() function; is there any alternative in SQLite 3? Answer 1: Using random(): SELECT foo FROM bar WHERE id >= (abs(random()) % (SELECT max(id) FROM bar)) LIMIT 1; EDIT (by QOP): The SQLite docs on autoincremented columns state that: "The normal ROWID selection algorithm described above will generate monotonically increasing unique ROWIDs as long as you never use the maximum ROWID value and you never delete the entry in the table with the largest ROWID." If you ever …
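The query above avoids a full-table sort by picking a random point in the id space, but note it assumes the ids are dense: any gap makes the row just after the gap over-represented. A runnable sketch with Python's sqlite3 module (table and column names taken from the answer; the inserted rows are placeholders):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bar (id INTEGER PRIMARY KEY, foo TEXT)")
conn.executemany(
    "INSERT INTO bar (foo) VALUES (?)",
    [("row%d" % i,) for i in range(100)],
)

# Random pick via the id space: fast, but only uniform when ids have no gaps.
row = conn.execute(
    "SELECT foo FROM bar "
    "WHERE id >= (abs(random()) % (SELECT max(id) FROM bar)) LIMIT 1"
).fetchone()
assert row is not None
```

If ids can have gaps (deleted rows, AUTOINCREMENT skips), the simpler but slower ORDER BY random() LIMIT 1 remains the uniform choice.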

quick random row selection in Postgres

强颜欢笑 · Submitted on 2019-11-26 03:07:56
Question: I have a table in Postgres that contains a couple of million rows. I checked on the internet and found the following: SELECT myid FROM mytable ORDER BY RANDOM() LIMIT 1; It works, but it's really slow... Is there another way to write that query, or a direct way to select a random row without reading the whole table? By the way, 'myid' is an integer, but it can be an empty field. Thanks. Answer 1: You might want to experiment with OFFSET, as in SELECT myid FROM mytable OFFSET floor(random()*N …
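The OFFSET approach skips a random number of rows instead of sorting the whole table, where N is the (approximate) row count. A runnable sketch of the same technique, demonstrated here against SQLite so it is self-contained (the original answer targets Postgres, where the form is OFFSET floor(random()*N) LIMIT 1; the table contents below are placeholders):

```python
import random
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (myid INTEGER)")
conn.executemany("INSERT INTO mytable VALUES (?)", [(i,) for i in range(1000)])

# Skip a random number of rows rather than ORDER BY random() over the
# whole table. N is the row count (in practice, a cached or estimated count).
n = conn.execute("SELECT count(*) FROM mytable").fetchone()[0]
row = conn.execute(
    "SELECT myid FROM mytable LIMIT 1 OFFSET ?", (random.randrange(n),)
).fetchone()
assert row is not None
```

Note that OFFSET still scans and discards the skipped rows, so it is faster than a full sort but not constant-time; on large Postgres tables, TABLESAMPLE or an indexed random key column scales better.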