
I want to load a large file line by line in Nim. I tried the following code snippet:

```nim
for line in lines "largefile.txt":
  echo line
```

However, this loads the entire file largefile.txt into memory, which is not feasible when the file is very large (> 30 GB).

How can I iterate over a large file while holding only a single line in memory?

  • I'm not sure how that would happen, unless you import a non-standard `lines` iterator somewhere. The one [in lib/system.nim](https://github.com/nim-lang/Nim/blob/09b6d8c0ca5f9b5590e58d90b987975f36df8dd6/lib/system.nim#L3044) reads the file line by line, just as you want it. And when I'm testing that with a large file, memory usage does remain constant, as expected. – Reimer Behrends Dec 30 '16 at 15:09
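For reference, an explicit loop over the standard `readLine` overload behaves the same way as the `lines` iterator: only the current line is held in memory, and the string buffer is reused. This is just a minimal sketch using the filename from the question:

```nim
# Minimal sketch: an explicit readLine loop, equivalent to `for line in lines ...`.
# Only the current line is held in memory; the same string buffer is reused.
var f = open("largefile.txt")   # raises IOError if the file cannot be opened
var line = ""
while f.readLine(line):
  echo line
f.close()
```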

1 Answer


Indeed, Reimer Behrends is right. The `lines` iterator works as expected.

The issue was that my file contained only escaped newline characters (the literal two-character sequence `\n`) rather than actual line breaks. As a result, Nim (correctly) reads the whole file as one very long line.
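If someone else has a file like that and still wants to stream it, here is a minimal, hypothetical sketch (the `escapedLines` iterator is my own name, not a stdlib API) that reads the file in fixed-size chunks and splits on the literal two-character sequence `\n`. Memory stays bounded as long as those delimiters occur reasonably often:

```nim
import streams, strutils

iterator escapedLines(path: string; chunkSize = 1 shl 16): string =
  ## Hypothetical helper: yields pieces delimited by the literal two-character
  ## sequence "\n" (backslash + 'n') instead of real newlines, reading the
  ## file in fixed-size chunks so only a bounded buffer is kept in memory
  ## (assuming the delimiter appears reasonably often).
  var s = newFileStream(path, fmRead)
  if s == nil:
    raise newException(IOError, "cannot open " & path)
  var buf = ""
  while not s.atEnd:
    buf.add s.readStr(chunkSize)        # read the next chunk
    var start = 0
    while true:
      let pos = buf.find("\\n", start)  # search for literal backslash + 'n'
      if pos < 0: break
      yield buf[start ..< pos]
      start = pos + 2                   # skip the two delimiter characters
    buf = buf[start .. ^1]              # keep the incomplete tail for the next chunk
  s.close()
  if buf.len > 0:
    yield buf                           # trailing data after the last delimiter

for line in escapedLines("largefile.txt"):
  echo line
```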
