I'm running a simple program to measure the distance between the stack and the heap within a single process, and I'm getting values that I may be misinterpreting.
This is the code:
#include <cstdint>

int main() {
    uint64_t *heap = new uint64_t;
    uint64_t stackvar;
    uint64_t diff = &stackvar - heap;
    float size = (float) diff / (1024 * 1024 * 1024); // conversion to GB
    delete heap; // was free(heap); new should pair with delete
    return 0;
}
The addresses of the heap allocation and the stack variable themselves look sensible, but I believe I'm miscalculating the distance between the two: after the conversion above, diff comes out to roughly 16 TB.
How do I convert this into an understandable memory range, or, if I've made a mistake, what is the proper interpretation of diff?