I have a large MATLAB file (150 MB) containing a 4070x4070 matrix. I need to work on this file in MATLAB, but I can't seem to load it: I am getting an "out of memory" error. Is there any other way I can load a file of this size? I am using a 32-bit processor and have 2 GB of RAM. Please help me, I am getting exhausted from dealing with this problem.
- Is it a *.mat file format, or some other format? – hatboyzero Feb 03 '12 at 19:05
- I am very surprised that a 150MB file could fill up 2GB of memory when opened! How much free memory does your machine have before you attempt the load? Are there any other programs you could close? – japreiss Feb 03 '12 at 19:20
- If this is a .mat file, does `whos -file` also cause problems, or just `load`? If it's not a .mat file, is there header text? – reve_etrange Feb 03 '12 at 21:22
3 Answers
Starting from release R2011b (ver. 7.13) there is a new object, `matlab.io.MatFile`, with `matfile` as its constructor. It allows you to load and save parts of variables in MAT-files. See the documentation for more details. Here is a simple example that reads part of a matrix:
matObj = matfile(filename);
a = matObj.a(100:500, 200:600);
If your original file is not a MAT-file but some text file, you can read it partially and use `matfile` to save those parts to the same variable in a MAT-file for later access. Just remember to set the `Writable` property to true in the constructor.
Assuming your text file is tab-delimited and contains only numbers, here is a sample script to read the data by blocks and save them to MAT file:
blocksize = 100;
startrow = 0;
txtfile = 'data.txt'; %# the source text file
matname = 'test.mat';
matObj = matfile(matname,'Writable',true);
%# determine the number of columns from the first line
fid = fopen(txtfile);
ncols = numel(regexp(fgetl(fid),'\t','split'));
fclose(fid);
while true
    try %# dlmread ranges are zero-based: [R1 C1 R2 C2]
        a = dlmread(txtfile,'\t',[startrow 0 startrow+blocksize-1 ncols-1]);
    catch
        break %# past the end of the file (a final block smaller than blocksize is skipped)
    end
    matObj.a(startrow+(1:blocksize),:) = a; %# matfile indexing is one-based
    startrow = startrow + blocksize;
end
I don't have the latest release at hand to test this, but I hope it works.

If it is an image file and you want to work with it, try MATLAB's block processing, `blockproc`. It loads only small parts of the file at a time, applying your function `fun` to each block individually:
B = blockproc(src_filename,[M N],fun)
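As a hedged sketch of what `fun` looks like (the file name, the 32x32 block size, and the use of `mean2` here are assumptions for illustration, not from the question): `fun` receives a block struct whose `data` field holds the current tile.

```matlab
%# Hypothetical sketch: compute one mean value per 32x32 tile of a large
%# image without reading the whole file. 'huge_image.tif' is an assumed name.
fun = @(block_struct) mean2(block_struct.data); %# block_struct.data is the tile
B = blockproc('huge_image.tif', [32 32], fun);  %# B has one entry per tile
```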
If it is an XML file, try SAX parsing rather than reading the whole DOM into memory (thanks to @Nzbuu for pointing that out), but that seems to be undocumented functionality.
Also, if it is a textual file of any kind (unlikely, given the amount of data), try an external tool to split it.

- XML DOM reads the whole file into memory first, so that won't help you here; SAX streams the XML and raises events to process the data. Also, I've found processing the DOM objects in MATLAB to be slow. – Nzbuu Feb 03 '12 at 19:15
- @Nzbuu, thanks. For some reason I thought that MATLAB always uses SAX. A quick Google search revealed that you are correct. – Andrey Rubshtein Feb 03 '12 at 19:40
You can also use MATLAB's Memory-Mapping of Data Files to read in a block of the file, process it, and proceed to the next block without having to load the entire file into memory at once.
For instance, see this example, which "maps a file of 100 double-precision floating-point numbers to memory."
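As a minimal sketch, assuming the data is stored as raw binary doubles written column by column with `fwrite` (the file name `data.bin` and the column count are assumptions; a MAT-file would need to be reformatted first):

```matlab
%# Hypothetical sketch: map a plain binary file of doubles as 4070-element
%# columns; only the column currently being read is paged into memory.
ncols = 4070;
m = memmapfile('data.bin', 'Format', {'double', [ncols 1], 'col'});
total = 0;
for k = 1:numel(m.Data)                 %# one struct element per mapped column
    total = total + sum(m.Data(k).col); %# touches a single column at a time
end
```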

- I tried this object in an earlier release, when it had just appeared in MATLAB. Interestingly, MATLAB had to load the whole file into memory anyway. I contacted MathWorks support and they confirmed it; they basically promoted the feature as the ability to share data between different applications at the same time. Well, it looks like things have changed: I've tried it again with R2011a and the `memmapfile` object occupies only 300 bytes. Nice answer, +1. Note that the file has to have a regular structure for this object (binary only?), so the input file will probably need to be reformatted anyway. – yuk Feb 06 '12 at 18:55