Is it possible to write objects to a file during an IndexedDb transaction?

Submitted by 点点圈 on 2019-12-25 06:46:05

Question


I have an objectStore with hundreds of objects in it that I can view with code like this:

db // already set by call to window.indexedDB.open
.transaction(["mailing-list"])
.objectStore("mailing-list")
.openCursor()
.onsuccess = function (event) {
    var cursor = event.target.result;
    if (cursor) {
        console.log(cursor.value);
        cursor.continue();
    }
};

What I want to do is use a FileWriter to write out each object as it's retrieved; I don't want to accumulate all the objects in a giant array and write them all out at once. Nor do I want to start a separate transaction for each object, as I want to use a cursor to iterate through all the objects. (Without caring about their keys.)

I would put the call to fileWriter.write where the call to console.log is now. The call to cursor.continue() would be in the onwrite callback. Otherwise, I would be issuing a write before the previous one had completed, which is illegal.
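In other words, something shaped roughly like this (fileWriter here is assumed to have been obtained already, e.g. from a FileEntry via the Chrome App fileSystem API):

db // already set by call to window.indexedDB.open
.transaction(["mailing-list"])
.objectStore("mailing-list")
.openCursor()
.onsuccess = function (event) {
    var cursor = event.target.result;
    if (cursor) {
        fileWriter.onwrite = function () {
            cursor.continue(); // advance only after the previous write completes
        };
        fileWriter.write(new Blob([JSON.stringify(cursor.value)]));
    }
};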

This seems to be impossible, because, as writing is asynchronous, the "onsuccess" callback would return after write is called, thus ending the transaction and rendering the cursor invalid, even if it's captured in a closure.

Am I right about this? I'm about to code it to accumulate the entire collection of objects in memory, so I can write them after the cursor completes, although that's not my first choice.
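For reference, that fallback would look roughly like this (same assumed fileWriter as above):

var records = [];
db // already set by call to window.indexedDB.open
.transaction(["mailing-list"])
.objectStore("mailing-list")
.openCursor()
.onsuccess = function (event) {
    var cursor = event.target.result;
    if (cursor) {
        records.push(cursor.value); // accumulate everything in memory
        cursor.continue();
    } else {
        // cursor is done and the transaction no longer matters; write it all in one go
        fileWriter.write(new Blob([JSON.stringify(records)]));
    }
};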

(NOTE: I don't think there's a synchronous form of writing, but, even if there were, I couldn't use it as this code will be in a Chrome App.)


Answer 1:


You can, as long as you have enough buffering between the two async processes.

I have blogged a bit about how this is done here:

Consider populating a store with records from a large delimited text file. To avoid holding all the records in memory, CsvStreamer downloads the data in chunks using the HTTP Range header and invokes the next callback whenever it has read a record. A chunk of data comprises multiple records, and in that case CsvStreamer writes synchronously; while waiting for the next chunk of data, it invokes the next callback asynchronously.

var db = new ydn.db.Storage(db_name, schema, options);
var stream = new CsvStreamer(url);
var isSerial = false;
var tdb = db.thread('multi', isSerial); // multi-request parallel transactions

var putData = function(data) {
  if (data) {
    tdb.put('store1', data).then(function() {
      // previous put succeeded; pull the next record from the stream
      stream.next(function(data) {
        putData(data);
      });
    }, function(e) {
      throw e;
    });
  }
};

// kick off the pipeline with the first record
stream.next(function(data) {
  putData(data);
});

The trick is to reuse the transaction if it is still active; otherwise, create a new transaction.
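Expressed with the raw IndexedDB API, the idea is roughly this (a minimal sketch of what the library does for you; putRecord and the store name are illustrative, and idb is a plain IDBDatabase handle):

var currentTx = null; // the last transaction we created, if it might still be active

function putRecord(idb, data, done) {
  function freshPut() {
    currentTx = idb.transaction(['store1'], 'readwrite');
    currentTx.oncomplete = function () { currentTx = null; };
    currentTx.objectStore('store1').put(data).onsuccess = done;
  }
  if (!currentTx) {
    freshPut();
    return;
  }
  try {
    // reuse the existing transaction; this throws if it has already auto-committed
    currentTx.objectStore('store1').put(data).onsuccess = done;
  } catch (e) {
    freshPut(); // transaction was no longer active, so start a new one
  }
}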

EDIT:

Yes, the above example writes to IndexedDB, whereas you want to read from it. When piping async processes together, you have to pace the pipeline to the slowest one, which here is the FileWriter. Here is an example:

var dump = function(marker) {
  var q = marker ? db.from('store name', '>', marker) : db.from('store name');
  q.open(function(cur) {
    var cb = writeToFile(cur.getValue());
    if (cb) { // the writer is busy and will call us back when it is ready
       var next_key = cur.getKey();
       cb(function() {
         dump(next_key); // note: de-references cur, though not strictly necessary
       });
       cb = null; // also not strictly necessary, but safer against memory leaks
       return null; // stop continuing the cursor
    } else { // the file writer says it is fine to call it again
      return undefined; // continue the cursor
    }
  }, iter, 'readonly');
};

dump();
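The same pattern can be written against the raw IndexedDB API from the question: whenever the writer needs time, let the transaction finish, remember the last key, and later open a new cursor just past it. A rough sketch, assuming the same writeToFile contract as above (it returns a callback-registering function when busy, or a falsy value when it can accept more):

function dump(idb, marker) {
  var range = marker ? IDBKeyRange.lowerBound(marker, true) : undefined;
  idb // the plain IDBDatabase from the question
  .transaction(["mailing-list"])
  .objectStore("mailing-list")
  .openCursor(range)
  .onsuccess = function (event) {
      var cursor = event.target.result;
      if (!cursor) {
          return; // finished
      }
      var cb = writeToFile(cursor.value);
      if (cb) {
          // writer is busy: let this transaction end, resume later from this key
          var nextKey = cursor.key;
          cb(function () { dump(idb, nextKey); });
      } else {
          cursor.continue(); // writer can keep up: stay in the same transaction
      }
  };
}

dump(db); // db here being the IDBDatabase, not the ydn.db.Storage wrapper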

"Is it possible to write objects to a file during an IndexedDb transaction?"

Short answer: No.



Source: https://stackoverflow.com/questions/20362184/is-it-possible-to-write-objects-to-a-file-during-an-indexeddb-transaction
