this.papa.parse(this.inputFile, {
      header: false,
      dynamicTyping: true,
      skipEmptyLines: true,
      error: (error) => {
        console.log('parse_error: ', error);
      },
      complete: (result) => {
        // result.data should contain all parsed rows once parsing completes
        data = result.data;
        ...
      }
    });

I am parsing a local CSV file (1.02 GB) but can't get the rows' content. When I parse a small file, everything works fine.

  • What's papa? You should add it as a tag if available for more visibility. – Phix Jan 15 '21 at 00:15
  • I would have assumed there is no other limit than your device's RAM. They are supposed to read this file in chunks of 24 MB, so as to avoid any max-string-length limit. – Kaiido Jan 15 '21 at 01:30

1 Answer


You should use chunks. From the documentation:

Can Papa load and parse huge files? Yes. Parsing huge text files is facilitated by streaming, where the file is loaded a little bit at a time, parsed, and the results are sent to your step callback function, row-by-row. You can also get results chunk-by-chunk (which is usually faster) by using the chunk callback function in the same way. (https://www.papaparse.com/docs)
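As a rough sketch, the row-by-row approach with the step callback looks like this. Here inputFile and handleRow are placeholders, not part of your code; with ngx-papaparse you would call this.papa.parse with the same options:

    import Papa from 'papaparse';

    // Placeholder: a File picked from an <input type="file"> element.
    const inputFile = document.querySelector('input[type="file"]').files[0];

    // Placeholder for your own per-row logic (validate, upload, aggregate, ...).
    const handleRow = (row) => {};

    Papa.parse(inputFile, {
      header: false,
      dynamicTyping: true,
      skipEmptyLines: true,
      step: (row) => {
        // row.data is the current parsed row; handle it and let it go out of
        // scope, so the whole file never has to fit in memory at once.
        handleRow(row.data);
      },
      error: (error) => {
        console.log('parse_error: ', error);
      },
      complete: () => {
        console.log('parsing finished');
      }
    });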

As for the maximum file size Papa Parse can handle, I'm not sure.

But I ran tests on two machines with a 45 MB chunk size (I also tried 20, 50, 100, etc.):

  • Old PC (AMD A8-5600K) with 8 GB RAM + 8 GB swap memory
  • MacBook Pro, mid-2015 base model, with 16 GB memory

I tried to parse a 4 GB file with:

 chunkSize: 1024 * 1024 * 45
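A chunk-based setup with that chunk size would look roughly like this (a sketch, not my exact test code; inputFile and the row counting are placeholders for your own per-chunk processing):

    import Papa from 'papaparse';

    // Placeholder: a File picked from an <input type="file"> element.
    const inputFile = document.querySelector('input[type="file"]').files[0];

    let rowCount = 0;

    Papa.parse(inputFile, {
      header: false,
      dynamicTyping: true,
      skipEmptyLines: true,
      chunkSize: 1024 * 1024 * 45, // 45 MB per chunk
      chunk: (results) => {
        // results.data holds only this chunk's rows, so memory use stays
        // bounded; process or upload them here instead of collecting the
        // whole file into a single array.
        rowCount += results.data.length;
      },
      error: (error) => {
        console.log('parse_error: ', error);
      },
      complete: () => {
        console.log('done, total rows: ' + rowCount);
      }
    });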

Results for Chrome and Opera

Both machines failed at around 1.8-1.9 GB.

Results for Firefox

Both machines successfully parsed the 4 GB file.
