
I tried to read and parse a JSON file of almost 90 MB (using the nlohmann library) with this code:

#include <fstream>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

std::ifstream ifs("data.json");
json jsonFile = json::parse(ifs);

The JSON file has almost 14,000 array elements that need to be parsed. When I run this code, my program's memory usage climbs until it reaches 2 GB. That is unexpectedly large for a small program like this.

How can I manage the memory usage here? Would it be better to read the file's contents some other way and then pass the data to the library?

NutCracker
Ali Sepehri-Amin
  • @some programmer dude, well, but not 2 GB for a 90 MB file... – Stefan Riedel Nov 05 '20 at 08:29
  • @StefanRiedel probably depends on how the `nlohmann` library handles the data, but that does seem like a lot of memory. – NutCracker Nov 05 '20 at 08:31
  • By the way, how do you check the memory usage? Even if the process has 2 GiB mapped to it, that doesn't mean all that memory is actually used. The OS could reclaim it if needed. – Some programmer dude Nov 05 '20 at 08:41
  • nlohmann has a SAX (event-based) interface: https://nlohmann.github.io/json/features/parsing/sax_interface/ . SAX parsers are harder to write, but they let you process a big file piecemeal without holding all of its contents in memory at once (see the sketch below the comments). – parktomatomi Nov 05 '20 at 09:43
  • As an aside, 90 MB is not that big (multi-GB JSON files are not uncommon), and nlohmann intentionally prioritizes ease of use over performance. A more efficient library like [sajson](https://github.com/chadaustin/sajson) or [simdjson](https://github.com/simdjson/simdjson) might use less RAM when storing the same document (a rough sketch follows below). – parktomatomi Nov 05 '20 at 10:29

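For the alternative-library route, a rough sketch with simdjson's On Demand API (simdjson 1.x or later is assumed; the file name and the top-level-array assumption are carried over from the question). Field access inside the loop is left as a placeholder.

#include <cstddef>
#include <iostream>
#include "simdjson.h"

int main() {
    simdjson::ondemand::parser parser;
    // load() reads the whole file into a padded buffer; parsing itself is lazy.
    simdjson::padded_string text = simdjson::padded_string::load("data.json");
    simdjson::ondemand::document doc = parser.iterate(text);

    std::size_t count = 0;
    simdjson::ondemand::array arr = doc.get_array();
    for (auto element : arr) {
        // Pull out only the fields you actually need here instead of
        // materializing every value.
        (void)element;
        ++count;
    }
    std::cout << count << " top-level elements\n";
}
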
0 Answers