
After a long time searching for an answer and not finding one, my last resort is asking a new question. I create multiple (N=1000) -v6 MAT-files that are each about 100 MB in size and contain a single matrix. In a separate part of my code, I need to load each file. The problem I'm running into is that loading the files suddenly becomes very time consuming around file number 600, and I'm not sure why it's happening. Thanks in advance for any suggestions.

I'm using MATLAB R2014b on a Mac with 16 GB of RAM.

Sample code

c = nan(1,1000);                                   % per-file load times in seconds
for h = 1:1000
    tic
    filename = [basefilename,'_',num2str(h),'.mat'];
    transition = load(filename,'P');               % load matrix P from the h-th file
    c(h) = toc;
end

Here is an image of the recorded loading times using the exact code above.
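For reference, here is a minimal diagnostic variant of the loop (not the code used for the plot above): it records the size of each loaded variable alongside the load time and clears the matrix every iteration so that only one copy is held in memory at a time. It assumes `basefilename` is defined as in the snippet above; `nFiles` and `bytesLoaded` are names introduced here purely for illustration.

    % Diagnostic sketch: track load time and loaded size, hold only one matrix at a time.
    nFiles = 1000;
    c = nan(1, nFiles);            % per-file load time in seconds
    bytesLoaded = nan(1, nFiles);  % per-file size of the loaded variable in bytes
    for h = 1:nFiles
        filename = [basefilename, '_', num2str(h), '.mat'];
        tic
        transition = load(filename, 'P');
        c(h) = toc;
        w = whos('transition');
        bytesLoaded(h) = w.bytes;  % confirm each file is roughly the expected ~100 MB
        clear transition w         % release the matrix before the next iteration
    end
    plot(c)                        % look for the jump around file ~600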

J. Lopez
    Does it have anything to do with your hardware? You should check how much memory and such is being used to see how it's being handled. – Laurel May 16 '16 at 22:12
  • What version of MATLAB are you using? – gariepy May 16 '16 at 22:22
  • I think your system has used up all the RAM and is now paging to the hard disk. On Windows we call it `virtual memory` and on Linux we call it `swap`, and I don't have to explain the difference in speed between `RAM` and a hard disk; if the disk is an `SSD` you may get better speed, but it still can't compete with `RAM`. What @gariepy asked is a reasonable question, though: 100 GB of data + memory used by the OS + memory used by active applications is too much even with swap/virtual memory. Aren't you getting an out-of-memory error from MATLAB or the OS? – Rijul Sudhir May 16 '16 at 22:23
  • Is there any way around using up the RAM? I don't save the matrix that I've read into memory; I just need it to perform one operation, and I could overwrite it with the one for the next iteration. – J. Lopez May 16 '16 at 22:25
  • Then you shouldn't be facing any out of memory issue. – Rijul Sudhir May 16 '16 at 22:29
  • Test your `RAM` with some memory-testing software. It is probably MATLAB's problem, but try it anyway. Then try compiling your code to MEX, since compiled code runs faster. – Rijul Sudhir May 16 '16 at 22:44
  • If you really want to speed things up, use a database to store your data: first save your .mat files to CSV or something similar and import them into a database table, then read the values directly from the database when required and do your processing. – Rijul Sudhir May 16 '16 at 22:48
  • Add `if (h>850) memory end` inside your loop, put a breakpoint at the `end`, and run your code. When `h>850` the breakpoint will trigger; check the command window for `Memory used by MATLAB`. Then you can see how much memory is being used at iteration `851`. – Rijul Sudhir May 16 '16 at 22:56
  • fclose('all') should help – Mikhail Genkin May 16 '16 at 23:27
  • Rijul – Since I'm running on a Mac, the `memory` command isn't available to me (see the sketch below the comments for one possible workaround). Mikhail – Unfortunately, fclose('all') didn't seem to help at all. – J. Lopez May 17 '16 at 18:11
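As a follow-up to the `memory` suggestion in the comments: MATLAB's `memory` function is Windows-only, so on a Mac a rough substitute is to shell out to the macOS `vm_stat` utility from inside the loop. This is only a sketch under that assumption (macOS with `vm_stat` available), not a tested solution:

    % Sketch of a Mac-friendly memory check (MATLAB's `memory` is Windows-only).
    if h > 850
        [status, vmOut] = unix('vm_stat');   % macOS virtual-memory statistics
        if status == 0
            disp(vmOut)                      % inspect "Pages free" and "Pageouts" here
        end
    end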

0 Answers