I'm trying to open large .csv files (16k+ lines, ~15 columns) in a Python script, and I'm running into some issues.
I use the built-in open() function to open the file, then create a csv.DictReader from that file object. The loop is structured like this:
for i, row in enumerate(reader):
    # do stuff (send serial packet, read response)
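For context, the whole thing looks roughly like this (the serial part uses pyserial as a stand-in for however I'm actually talking to the device, and the filename, port, and column name are placeholders, not my real values):

import csv
import serial  # pyserial, standing in for the real serial code

ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)  # placeholder port/baud

f = open('data.csv')
reader = csv.DictReader(f)
for i, row in enumerate(reader):
    # build a packet from the row, send it, read the response back
    ser.write(row['payload'].encode())  # 'payload' is a placeholder column name
    response = ser.readline()
f.close()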
However, with a file longer than about 20 lines, the file opens fine, but within a few iterations I get ValueError: I/O operation on closed file.
My thought is that I might be running out of memory (though the 16k-line file is only 8MB, and I have 3GB of RAM), in which case I expect I'll need some sort of buffer that loads only sections of the file into memory at a time.
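If buffering is the right fix, I was picturing something like this (chunk size picked arbitrarily; I haven't tried it yet):

import csv
from itertools import islice

with open('data.csv') as f:
    reader = csv.DictReader(f)
    while True:
        chunk = list(islice(reader, 100))  # pull 100 rows into memory at a time
        if not chunk:
            break
        for row in chunk:
            pass  # do stuff (send serial packet, read response)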
Am I on the right track? Or could there be other causes for the file closing unexpectedly?
Edit: about half the time I run this with a CSV of 11 lines, it gives me the ValueError. The error does not always happen on the same line.