
I want to allocate about 10 GB of RAM, but I get this error:

error C2148: total size of array must not exceed 0x7fffffff

my simplified code is:

int main(){
    char* myBuffer = new char[11000000000];
    //char* myBuffer = new char[4000000000]; //compiled successfully
}

I know about differences between x86 and x64 and addressing size limit in x86. So I set my target to x64. I also know about stack size limit, but please note that I am allocating on heap.

Surprisingly, the code below compiles successfully:

#include <cstdlib>
int main(){
    char* myBuffer = (char*) malloc(11000000000); //compiles successfully, even with far larger sizes
}

What is wrong with my code when I use the `new` operator?

The environment: Visual Studio 2013 (empty project), Windows Server 2008 R2 Datacenter, 128 GB RAM.

Edit: The link provided by n.m. doesn't answer my question completely. I also want to know why `malloc` works well here but `new` does not.

IndustProg
  • `char* myBuffer = (char*) malloc[11000000000];` - error: pointer to a function used in arithmetic [-Wpointer-arith] check your compiler warning/error level – Richard Critten Sep 26 '17 at 10:31
  • That does not answer my question. It is solved by using `std::vector`. But I want to know what is wrong with `new` operator here. – IndustProg Sep 26 '17 at 10:35
  • Your C++ compiler has a limit on the maximum size of an array. That's it. – Sam Varshavchik Sep 26 '17 at 10:37
  • Looks like you are compiling a 32-bit application (0x7fffffff == 2,147,483,647) and are limited to 2 GB – Richard Critten Sep 26 '17 at 10:37
  • @RichardCritten as I mentioned before I set target to x64. – IndustProg Sep 26 '17 at 10:39
  • @SamVarshavchik I tried Intel C++ compiler 2017, but the error still appears; the only difference is that the error reads: `error: array is too large` – IndustProg Sep 26 '17 at 10:40
  • Compiles with VS2017 and VS2015. –  Sep 26 '17 at 10:43
  • What is the size of `size_t` on your machine? Do you really have `[]` after `malloc` or is it just a typo in the question? – taskinoor Sep 26 '17 at 10:48
  • Also just because you have 128G physical memory doesn't mean OS will assign all those memory to your process. If I'm not mistaken, modern OS assigns heap spaces per process and there might be some limit there as well. – taskinoor Sep 26 '17 at 10:52
  • Thanks, it was a typo; I corrected it. I ran `cout << sizeof(size_t) << endl` and it printed `8`. – IndustProg Sep 26 '17 at 10:53
  • You mentioned that you increased the stack size, but did you also increase the heap size? Edit: Like [this](https://msdn.microsoft.com/en-us/library/windows/desktop/f90ybzkh(v=vs.120).aspx)? – AndyG Sep 26 '17 at 10:53
  • @AndyG Yes, I did. I went to `Project properties> Linker> System` and increased `Heap reserve size` and `Heap commit size` to 80,000,000,000. But nothing and the error still appeared! – IndustProg Sep 26 '17 at 10:59
  • @taskinoor `typedef unsigned __int64 size_t`. Did you mean this? – IndustProg Sep 26 '17 at 11:03
  • @GntS yes it is 64 bit, so that's not problem. From the duplicate it looks like an issue with older VS compiler. – taskinoor Sep 26 '17 at 11:09
  • If you have an array with more elements than the max value of `ptrdiff_t`, some operations will not work because of integer overflow - like computing the distance between the first and the last element. So having a limit is quite reasonable. Having *the same* limit for 32-bit and 64-bit systems may or may not be a bug, depending on whether it was done on purpose. – Bo Persson Sep 26 '17 at 12:33
  • What exact part of your question is not answered by this statement: **"you have stumbled upon a compiler bug"**? – n. m. could be an AI Sep 26 '17 at 14:43

0 Answers