
I am storing a huge CSV file as a list of dictionaries, like below:

    dictlist = [{'col-1': 'data-1', 'col-2': 'data-2', 'col-3': 'data-3'},
                {'col-1': 'data-1', 'col-2': 'data-2', 'col-3': 'data-3'}]

where each dictionary is one row of the CSV file, keyed by column name, and each value is the cell data for that row.
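
For context, the list is built roughly like this with the standard csv module (the file name is a placeholder):

    import csv

    # Placeholder path; csv.DictReader yields one dict per row, keyed by the header.
    with open("huge.csv", newline="") as f:
        dictlist = [dict(row) for row in csv.DictReader(f)]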

My problem is that the CSV file has 4 million+ rows, and at that size this data structure does not fit in memory (RAM).

Can someone help me find a solution, such as storing the data structure on disk? I need to use the entire data structure at once.

Thanks in advance


1 Answer

  • I solved this by using the disklist library.
  • First, install it with pip: pip install disklist.
  • Then import it and create one the same way you would a list, e.g. some_variable_name = DiskList().
  • A DiskList works almost the same as a Python list, but its items are stored on disk rather than in RAM (see the sketch below).
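
A minimal sketch of that approach; the import and class name (DiskList from the disklist package) and the file name are assumptions based on the package's documented list-like API:

    import csv
    from disklist import DiskList  # pip install disklist

    rows = DiskList()  # list-like container whose items live on disk, not in RAM
    with open("huge.csv", newline="") as f:  # placeholder file name
        for row in csv.DictReader(f):
            rows.append(dict(row))  # one dict per CSV row, keyed by column name

    print(rows[0])  # indexing and iteration work like a regular Python list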