I have a JSON file of about 800 MB and no way to load it into memory, even with FileReader.readAsText
(the result property comes back as an empty string). I don't think this is relevant, but the JSON file is an array of about 3.5 million small objects. Note that the file is picked by the user in the browser and never leaves the browser; all processing happens in the browser.
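For reference, this is roughly the shape of the straight read that fails (my actual code differs, but the symptom is the same):

const fr = new FileReader();
fr.onload = () => {
    // For the 800 MB file, fr.result comes back as an empty string.
    console.log((fr.result as string).length);
};
fr.readAsText(file);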
I tried Oboe.js and could stream the input in, but it stops after a while. Judging by the event I get, I suspect Oboe is retaining every JSON object it parses. If I understand correctly, its browser build doesn't support streams either.
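The docs do mention oboe.drop: returning it from a node callback is supposed to make Oboe discard the parsed node from the tree it accumulates. A sketch of how I read that (objectUrl and handleItem are just placeholders for this illustration; I haven't verified it helps at this scale):

// Placeholders for this sketch only.
declare const objectUrl: string;
declare function handleItem(item: any): void;

oboe(objectUrl)
    .node("!.[*]", (item: any) => {
        handleItem(item);  // process the item immediately...
        return oboe.drop;  // ...then let Oboe discard it from its internal tree
    });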
Is there any way to forward-read JSON in the browser? I don't mind losing the previous state as I go, similar to .NET's Utf8JsonReader.
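To illustrate what I mean: I can already walk the file forward in chunks with Blob.stream() (a standard Web API) without retaining anything; the missing piece is an incremental JSON parser to feed those chunks into. A rough sketch of the reading side (scanForward is just a name for this illustration):

async function scanForward(file: Blob) {
    const reader = file.stream().getReader();
    const decoder = new TextDecoder();
    for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        // Decode incrementally so multi-byte UTF-8 sequences split across
        // chunk boundaries are handled correctly.
        const text = decoder.decode(value, { stream: true });
        // ...feed `text` to an incremental JSON parser here...
    }
}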
Here's my current attempt with Oboe:
import oboe from "oboe";

async loadAsync(file: Blob) {
    const blocks: any[] = this.blocks = [];
    await new Promise<void>((resolve, reject) => {
        const ms = Date.now();
        // Oboe's browser build takes a URL, so expose the Blob through one.
        const url = URL.createObjectURL(file);
        let counter = 0;
        oboe(url)
            .node("!.[*]", (block: any) => {
                // Called once per array element as it is parsed.
                blocks.push(block);
                counter++;
                if (counter % 5000 === 0) {
                    this.logFn(`Loading: ${counter} items so far.`);
                }
            })
            .done(() => {
                URL.revokeObjectURL(url);
                this.logFn(`Finished loading ${counter} blocks in ${Date.now() - ms}ms`);
                resolve();
            })
            .fail((err: any) => {
                URL.revokeObjectURL(url);
                console.error(err);
                this.logFn(err);
                reject(err);
            });
    });
}
This works well for small files, but with the big file it fails after about 250k items with this error: