
I have a large file (300 MB) saved in the JSON Lines format, meaning the file contains thousands of JSON objects separated by line breaks.

Unfortunately I have no idea how to work with a file like this. Can anyone give me a short introduction on how to handle such files?

Sinan Theuvsen
  • Is there any way to break up that file into multiple files? That's a ton of data to store in browser memory and will likely crash the browser. – Chase DeAnda Jun 11 '18 at 14:14
  • 1
    https://stackoverflow.com/help/how-to-ask – Michael Jun 11 '18 at 14:14
  • http://jsonlines.org/. Stack Overflow isn't really a good fit for tutorials. Follow the examples on that site and if you run into a specific problem, ask a question about that. – Heretic Monkey Jun 11 '18 at 14:15
  • Look into a JSON streaming parser. Here is a good package that you can use on the client with JS: http://oboejs.com/examples#loading-json-trees-larger-than-the-available-ram – Chase DeAnda Jun 11 '18 at 14:21

2 Answers

4

Don't use React / don't handle it on the client. In Node.js you can read such files directly: fs.readFileSync('file.jsonl', 'utf8').split('\n').map((row) => JSON.parse(row)). If the file is too large for that, use a line-by-line reader and call JSON.parse() on each line. Search npm for a package that does this.

1

You are better off handling it on the server. Sending that much data to the client will definitely cause some hiccups there. An efficient way of handling such data is with streams. So you could do something like this on the server:

const fs = require('fs');
const http = require('http');

const server = http.createServer((req, res) => {
    const readStream = fs.createReadStream('yourLargeFile.txt'); // create read stream
    readStream.pipe(res); // stream the file to the client chunk by chunk
});

server.listen(PORT, IP); // e.g. server.listen(3000, '127.0.0.1')

Read more here: Node stream documentation

Dave Kalu