I have a 15-16 GB file containing newline-separated JSON data (approximately 70-80 million records).
What is the easiest way to read such a large file in Python quickly and without consuming much memory?
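For reference, here is a minimal sketch of the streaming approach I have in mind: iterating over the file line by line, so only one record is ever held in memory (`stream_records` and the file name are just placeholders):

```python
import json

def stream_records(path):
    # Iterate one line at a time; the file is never loaded fully into memory.
    with open(path, "rb") as f:  # binary mode avoids newline translation overhead
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                yield json.loads(line)

# Usage sketch:
# for record in stream_records("data.jsonl"):
#     process(record)
```

Is this the right idea, or is there a faster way for files of this size?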
Also, if the script fails partway through, how can I resume reading from the last processed line?
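For the resume part, I was thinking of periodically checkpointing the byte offset and seeking back to it on restart. A rough sketch of that idea (the checkpoint file name, `checkpoint_every` interval, and `handle` callback are all made up for illustration; I track the offset manually because `f.tell()` is disabled while iterating a text-mode file):

```python
import json
import os

def load_offset(checkpoint_path):
    # Return the last saved byte offset, or 0 on a fresh start.
    try:
        with open(checkpoint_path) as f:
            return int(f.read())
    except FileNotFoundError:
        return 0

def save_offset(checkpoint_path, offset):
    # Write-then-rename so a crash mid-write cannot corrupt the checkpoint.
    tmp = checkpoint_path + ".tmp"
    with open(tmp, "w") as f:
        f.write(str(offset))
    os.replace(tmp, checkpoint_path)

def process_with_resume(path, handle, checkpoint_path="progress.offset",
                        checkpoint_every=10_000):
    offset = load_offset(checkpoint_path)
    with open(path, "rb") as f:
        f.seek(offset)              # jump straight to the last saved position
        for n, line in enumerate(f, 1):
            offset += len(line)     # track the byte position ourselves
            if line.strip():
                handle(json.loads(line))
            if n % checkpoint_every == 0:
                save_offset(checkpoint_path, offset)
    save_offset(checkpoint_path, offset)
```

Would this be a reasonable approach, or is there a more robust pattern for resumable processing of huge files?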