2

I can initialize the vector with 10^8, but I can't initialize it with 10^9. Why?

 vector<int> bucket;
 bucket.resize(100000000);      // 10^8: works
 bucket.resize(1000000000);     // 10^9: fails
张鸿宇
  • 15
  • 4
  • 1
    Let a test program print `std::numeric_limits<std::size_t>::max()` and check the size. – skratchi.at Mar 01 '19 at 06:15
  • The error message you get should give you an indication of what is wrong – P.W Mar 01 '19 at 06:18
  • `sizeof (int)` is probably at least 4. A vector with 1000000000 elements would require 4000000000 bytes. That's a little bit less than 4 GB. It might work on a 64-bit platform (with sufficient memory available in the OS). On a 32-bit platform, I'm not that sure. Btw. which of them are you on? (Platform of the compiler, as you could compile and run 32-bit applications even on certain 64-bit OSes.) – Scheff's Cat Mar 01 '19 at 06:18
  • The OS is Windows 64-bit. – 张鸿宇 Mar 01 '19 at 06:25
  • Is it possible to allocate the memory in steps, or is the memory of a `std::vector` always in one block? – skratchi.at Mar 01 '19 at 06:31
  • Are you using Visual Studio? (If so) Which platform have you chosen: x86 (32-bit) or x64 (64-bit)? – Scheff's Cat Mar 01 '19 at 06:34
  • I use VS Code. – 张鸿宇 Mar 01 '19 at 06:46
  • 1
    @skratchi.at `vector` always allocates in one big block. – user4581301 Mar 01 '19 at 06:54
  • _I use VS Code_ With which compiler? g++? It supports x86 as well as x64, but I don't know the respective option to choose. – Scheff's Cat Mar 01 '19 at 06:59
  • Yes, the compiler is g++. – 张鸿宇 Mar 01 '19 at 07:03
  • When we were developing on VS with x86 (32-bit), the actual limit for memory allocations was actually much below 4 GB. Although 32 bits theoretically allows addressing up to 4 GB, the actual possible size was lower because parts of the address range were reserved for the OS. Furthermore, the OS and the applications themselves consume part of the memory. Additionally, the heap management is organized in blocks which are somehow distributed in memory. So, a contiguous block of (nearly) 4 GB is surely not allocatable on a 32-bit platform — at least, not on Windows. – Scheff's Cat Mar 01 '19 at 07:05
  • Do you have an actual need for a vector of 1000000000 elements, or was it just testing out of curiosity? In the former case, choosing the x64 platform for the compiler may help. (Of course, a 64-bit Windows (which you have) is mandatory to run 64-bit code.) – Scheff's Cat Mar 01 '19 at 07:10
  • @user4581301 ...except that paging will allow non-contiguous physical memory to provide a contiguous block of virtual memory to the application. ;-) (Am I nit-picking?) – Scheff's Cat Mar 01 '19 at 07:14
  • I googled for quite a while and finally ended up at the [gcc options](https://gcc.gnu.org/onlinedocs/gcc/Option-Summary.html). It seems `-m64` is the one to force g++ to generate x64 code. You might wonder why this isn't the default. I'm not sure. (I'm not even sure how to query the default.) However, even MS ships VS/VC as 32-bit code (although it supports generation of 32-bit as well as 64-bit code). 32-bit applications might have slightly lower memory requirements. (The only reasonable explanation which comes to mind.) – Scheff's Cat Mar 01 '19 at 07:38

2 Answers

3

It's because the `resize` function requests memory from the heap. As you can figure, the second `resize` needs 4,000,000,000 bytes (with a 4-byte `int`), which is more than your system can allocate in one piece (it may be unable to find a contiguous block of that size for you), so the call fails and throws an exception.

The maximum amount of memory you can request depends on several factors:

  1. the physical memory installed in the machine;
  2. whether the OS is 32-bit or 64-bit;
  3. the memory left for user processes. The operating system must meet the kernel's needs first; generally speaking, the Windows kernel needs more memory than Linux or Unix;
  4. ..

In short, it is hard to know the exact amount of memory you can use, because it is a dynamic value. But you can make a rough estimate with the `new` operator, and here is a good reference.

YaleCheung
  • 630
  • 3
  • 9
1

C++ vectors allocate memory in a contiguous block, and it is likely that the operating system cannot find such a block when the requested size gets too large.

Does the error message you are getting indicate that you are running out of memory?

The point is: even if you think that you have enough memory left on your system, if your program's address space cannot accommodate the large block in one chunk, then you cannot construct the large vector (the maximum address space size may differ between 32-bit and 64-bit programs).

J.R.
  • 1,880
  • 8
  • 16