I am storing a large number of small objects in IndexedDB. I would like to give the user the ability to export one of the object stores to a file that they can "download".
I have read a blog article which describes reading the data, JSON.stringify-ing it, encoding it with encodeURIComponent, and placing the result as the href of a link the user can click to download the data. Something like this:
var transaction = db.transaction([objectstore], "readonly");
var content = [];
var objectStore = transaction.objectStore(objectstore);

objectStore.openCursor().onsuccess = function(event) {
  var cursor = event.target.result;
  if (cursor) {
    content.push({key: cursor.key, value: cursor.value});
    cursor.continue();
  }
};

transaction.oncomplete = function(event) {
  // serialize the array accumulated by the cursor above
  var serializedData = JSON.stringify(content);
  link.attr("href", "data:application/octet-stream," + encodeURIComponent(serializedData));
  link.trigger("click");
};
That is fine, except the object store will have millions of records and I don't feel that this will be performant enough. Is there a way to more directly allow the user to save an object store off as a file (in a way I can import again via the webpage).
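One way to avoid the encodeURIComponent pass entirely is to hand the serialized data to a Blob and download it through an object URL. A minimal sketch, assuming a plain DOM anchor rather than the jQuery-wrapped link above (the downloadBlob helper and filename are illustrative, not from the original code):

```javascript
// Build a Blob from already-serialized data, trigger a download via a
// temporary anchor element, then revoke the object URL so the Blob
// can be garbage-collected.
function downloadBlob(serialized, filename) {
  var blob = new Blob([serialized], { type: "application/octet-stream" });
  var url = URL.createObjectURL(blob);
  var a = document.createElement("a");
  a.href = url;
  a.download = filename; // suggests a filename instead of an opaque GUID
  a.click();
  URL.revokeObjectURL(url);
}
```

Because the Blob holds the bytes directly, the browser never has to build or parse a multi-megabyte data URI string.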
Edit: Based on notes in the comments I rewrote part of this to squeeze a little more performance out of it. The new code is similar to:
var transaction = db.transaction([objectstore], "readonly");
var objectStore = transaction.objectStore(objectstore);

objectStore.getAll().onsuccess = function(evt) {
  // getAll() delivers the array on evt.target.result (not .results);
  // stringify it so the Blob contains JSON rather than "[object Object]" parts
  var serialized = JSON.stringify(evt.target.result);
  var url = window.URL.createObjectURL(new Blob([serialized], {type: 'application/octet-stream'}));
  link.attr('href', url);
  link.trigger('click');
};
Which will give me results like:
- 10k records, 975.87ms average export time
- 100k records, 5,850.10ms average export time
- 1mil records, 56,681.00ms average export time
As you can see, one million records takes about a minute to export. Is there a better way to be doing this? (I also tried using a cursor instead of getAll(), but cursors were slower.)
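For the re-import side mentioned at the start, a minimal sketch of the reverse path, assuming the file contains the JSON array of {key, value} pairs produced by the cursor-based export (db and storeName are placeholders, and put(value, key) only applies to stores without an inline keyPath):

```javascript
// Parse an exported JSON string back into an array of {key, value} records.
function parseExport(text) {
  return JSON.parse(text);
}

// Write the records back into an object store in a single readwrite
// transaction. Assumes an out-of-line-key store; for a store with an
// inline keyPath, call store.put(rec.value) instead.
function importRecords(db, storeName, records) {
  return new Promise(function (resolve, reject) {
    var tx = db.transaction([storeName], "readwrite");
    var store = tx.objectStore(storeName);
    records.forEach(function (rec) {
      store.put(rec.value, rec.key);
    });
    tx.oncomplete = function () { resolve(records.length); };
    tx.onerror = function () { reject(tx.error); };
  });
}
```

Reading the file itself can be done with file.text() on the File from an input element; for millions of records it may be worth splitting the put calls across several transactions so the page stays responsive.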