I'm running the following C++ code on Ubuntu with 4 GB of RAM:
const long long nSize = 400000000;  // 400 million doubles ≈ 3.2 GB
double Array1[nSize];
for (int i = 0; i < nSize; i++)
    Array1[i] = 2 * 2;  // store on the stack
This fits in RAM and my computer doesn't complain. Confusingly, though, htop says barely any additional RAM is being used at runtime. Why? (I let the program sleep for 100 seconds in case htop needs time to update.)
On the other hand, if I dynamically allocate the huge array (as tutorials like this recommend I should do), htop tells me it's using up most of the RAM, if not all of it, and it sometimes crashes:
double *pnArray2 = new double[nSize];
for (int i = 0; i < nSize; i++)
    pnArray2[i] = 2 * 2;  // store on the heap
delete[] pnArray2;       // free it when done
So why should I use the heap to store big data structures, if (as in this example) the stack can apparently handle even bigger arrays? I thought the heap was supposed to be much bigger than the stack! Please tell me where I'm going wrong.