I am storing a huge CSV file as a list of dictionaries, like below:
dictlist = [{'col-1': 'data-1', 'col-2': 'data-2', 'col-3': 'data-3'},
            {'col-1': 'data-1', 'col-2': 'data-2', 'col-3': 'data-3'}]
where the list index serves as the row number and each dictionary holds the data of one row of the CSV file.
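Roughly, this is how I build the list (simplified sketch; the file name is just a placeholder):

import csv

# Simplified sketch of how the list is built; 'data.csv' is a placeholder name
with open('data.csv', newline='') as f:
    reader = csv.DictReader(f)          # yields one dict per row, keyed by column name
    dictlist = [row for row in reader]  # materialises every row in RAM at once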
My problem is that the CSV file has more than 4 million rows, and a data structure that large does not fit in memory (RAM).
Can someone help me find a solution, for example storing the data structure on disk or something similar? I still need to be able to use the entire data structure at once.
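To give an idea of the direction I was thinking of, something like the standard-library shelve module came to mind, but I don't know if this is the right approach (the file names and keys below are only placeholders):

import csv
import shelve

# Rough idea only: push each row into a disk-backed mapping keyed by row number
with shelve.open('rows.db') as db, open('data.csv', newline='') as f:
    reader = csv.DictReader(f)
    for i, row in enumerate(reader):
        db[str(i)] = row  # shelve keys must be strings

# Rows can later be looked up by key without holding all of them in RAM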
Thanks in advance