
I can't understand the following scenario.

Initial MATLAB memory in Activity Monitor (on Mac): 969.4 MB

When I load my .mat file containing 80x60x13x15238 images, which is 2.1 GB on the hard disk, the memory usage in Activity Monitor reaches 7.80 GB and compressed memory reaches 172.6 MB.

When I start simulations, memory usage reaches 8.22 GB and compressed memory 6.13 GB.

I have three questions: 1) Why does the memory usage go so high for 2.1 GB of data? 2) How can I reduce it, or which format is best so that it takes less memory? 3) And if there is another format, is it fast to load and use?

Regards.

khan

1 Answer

  1. You are working with 7.08 GiB of data; the MAT-file is only 2.1 GiB on disk because it's a gzip-compressed HDF5 file.
  2. You are loading 7.08 GiB of data, so this requires 7.08 GiB of memory.
  3. You are already using the right format. Your data is simply too large.
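As a back-of-the-envelope check of the arithmetic in the answer (plain Python here as a stand-in; MATLAB stores a `double` in 8 bytes and a `single` in 4, so casting with `single()` would roughly halve the footprint if the reduced precision is acceptable):

```python
# Size of the full 80x60x13x15238 array at double vs. single precision.
elems = 80 * 60 * 13 * 15238         # number of elements in the array
bytes_double = elems * 8             # MATLAB double: 8 bytes per element
bytes_single = elems * 4             # MATLAB single: 4 bytes per element
gib = 1024 ** 3                      # bytes per GiB

print(round(bytes_double / gib, 2))  # -> 7.08
print(round(bytes_single / gib, 2))  # -> 3.54
```

So the 7.08 GiB figure is just the raw element count times 8 bytes; the 2.1 GB file size only reflects how well the data compresses on disk.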
Daniel
  • Ok, the first part of my question is clear. Now can you guide me somehow regarding the second and third parts? – khan Feb 21 '15 at 18:45
  • Your raw data is 7.08 GiB; that's not MATLAB inflating the size. 80x60x13x15238 at 64 bits per element is simply that large. If you want to load that many double values, you need 7.08 GiB of memory. Do you really need all the data in memory at once? – Daniel Feb 21 '15 at 19:00
  • I need 80x60x13 as one input, and it's a neural network that reads all 15238 samples. So somehow I need all of them, but all the data is saved in one file, and I have to read these 15238 samples for several epochs until the error minimizes to zero, or as close as possible. – khan Feb 21 '15 at 20:36
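Daniel's "all the data in memory at once?" question points at the usual workaround: since a -v7.3 MAT-file is HDF5, MATLAB's `matfile` function can read slices of a variable (e.g. one 80x60x13 sample at a time) without loading the whole array, at the cost of slower per-epoch I/O. A minimal language-neutral sketch of that streaming idea, using Python and a throwaway binary file as a stand-in for the MAT-file:

```python
# Sketch (assumption): stream training samples one at a time from disk,
# so peak memory is one 80x60x13 sample rather than the full data set.
# MATLAB's matfile() gives the analogous partial loading for -v7.3 files.
import array
import os
import tempfile

sample_elems = 80 * 60 * 13          # elements in one 80x60x13 input
n_samples = 8                        # tiny stand-in for the 15238 samples

# Write the stand-in data set: n_samples single-precision samples, back to back.
path = os.path.join(tempfile.mkdtemp(), "samples.f32")
with open(path, "wb") as f:
    for i in range(n_samples):
        array.array("f", [float(i)] * sample_elems).tofile(f)

# One training epoch: read one sample per iteration instead of load-all.
bytes_per_sample = sample_elems * 4  # 4 bytes per single-precision value
with open(path, "rb") as f:
    for i in range(n_samples):
        buf = array.array("f")
        buf.fromfile(f, sample_elems)  # holds just this sample in memory
        # ... feed buf to the network here ...
```

Repeating this loop per epoch trades memory for disk reads; whether that is acceptable depends on how many epochs the training needs.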