I'm trying to convert a large (~20 GB) .fvp file, which behaves like a delimited text file, to a MATLAB array and save it as a .mat file with Python 3. Right now, I am reading the file in Python, converting the values to a single NumPy array, and saving it with scipy.io.savemat().

However, my PC runs out of memory in the process. I think this is due to the size of the NumPy array, since my code runs fine on smaller .fvp files.

To solve this problem, I want to convert and save sections of the .fvp file as multiple .mat files and join them up later, preferably in Python. Is there a way to do this? I can't find one in scipy.io.
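The chunked-writing part can be done with scipy.io.savemat alone. Below is a minimal sketch, assuming the .fvp file is comma-delimited with one numeric row per line; the names `iter_chunks`, `save_in_parts`, `chunk_rows`, and the variable name `data` are illustrative, not part of any library:

```python
# Sketch: convert the file in chunks so that only chunk_rows rows
# are held in memory at any time, writing each chunk to its own
# MAT-file (mat1.mat, mat2.mat, ...).
import numpy as np
from scipy.io import savemat

def iter_chunks(path, chunk_rows, delimiter=","):
    """Yield successive NumPy arrays of at most chunk_rows rows."""
    rows = []
    with open(path) as f:
        for line in f:
            rows.append([float(v) for v in line.split(delimiter)])
            if len(rows) == chunk_rows:
                yield np.asarray(rows)
                rows = []
    if rows:  # leftover rows at the end of the file
        yield np.asarray(rows)

def save_in_parts(path, chunk_rows, prefix="mat"):
    """Write each chunk to its own MAT-file and return the filenames."""
    names = []
    for i, chunk in enumerate(iter_chunks(path, chunk_rows), start=1):
        fname = f"{prefix}{i}.mat"
        savemat(fname, {"data": chunk})
        names.append(fname)
    return names
```

Peak memory is then bounded by one chunk rather than the whole 20 GB array, at the cost of having many partial files to reassemble later.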

  • Do you mean join the MAT-files on disk? How would that work? In MATLAB you can write to a MAT-file incrementally. I would recommend that route. If you use Python only, why use MAT-files? – Cris Luengo Dec 01 '19 at 13:34
  • I don't have MATLAB on my current computer and writing the .mat files takes a long time. So to save time I was thinking of writing the .mat files with python on this computer and then transferring them to a faculty computer that does have MATLAB installed for analysis. My idea was to write `mat1.mat`, `mat2.mat` ... and join them together with python if possible. – ag3nt Dec 01 '19 at 18:17
  • Why not keep the files with partial matrices and concatenate them in MATLAB on the faculty machine, which I presume has more memory than yours? – Cris Luengo Dec 01 '19 at 18:50
  • I was hoping to keep things compartmentalised, by preparing one .mat file on one computer and running the analysis on the faculty machine. I suppose I'll follow your suggestion if there's no way to do it. Thanks! – ag3nt Dec 02 '19 at 23:34
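If the join does end up being done in Python, it is a straightforward load-and-concatenate; the caveat, as the comments point out, is that this step needs enough RAM for the full array, so it belongs on the faculty machine. A sketch, assuming each part stores the same variable name (`data` here is illustrative):

```python
# Sketch: join mat1.mat, mat2.mat, ... back into a single MAT-file.
# Note this loads every part into memory at once, so it should run
# on a machine with enough RAM for the full array.
import numpy as np
from scipy.io import loadmat, savemat

def join_mat_parts(part_names, out_name, var="data"):
    """Concatenate the named variable from each part along axis 0."""
    parts = [loadmat(name)[var] for name in part_names]
    savemat(out_name, {var: np.concatenate(parts, axis=0)})
```

The same concatenation in MATLAB is a one-liner per part (`full = [full; part.data]`), which is why doing it there was suggested above.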

0 Answers