
I'm a student, and my Operating Systems class project has hit a little snag, which is admittedly somewhat superfluous to the assignment specification itself:

While I can push 1 million deques into my deque of deques, I cannot push ~10 million or more.

Now, in the actual program there is a lot going on, and the only existing Stack Overflow answer with even the slightest relevance had exactly that: only slight relevance. https://stackoverflow.com/a/11308962/3407808

Since that answer focused on "other functions corrupting the heap", I isolated the code into a new project and ran it separately, and it failed in exactly the same way.

Here's the code itself, stripped down and renamed for the sake of space.

#include <iostream>
#include <string>
#include <sstream>
#include <deque>

using namespace std;

class cat
{
public:
    cat() {}
};

bool number_range(int lower, int upper, double value)
{
    if (value >= lower && value <= upper)
    {
        return true;
    }
    cin.clear();
    cerr << "Value not between " << lower << " and " << upper << ".\n";
    return false;
}

double get_double(const char *message, int lower, int upper)
{

    double out;
    string in;

    while(true) {

        cout << message << " ";
        getline(cin,in);
        stringstream ss(in); //convert input to stream for conversion to double

        if(ss >> out && !(ss >> in))
        {
            //(ss >> out) checks for a valid conversion to double
            //!(ss >> in) checks for unconverted trailing input and rejects it
            if (number_range(lower, upper, out))
            {
                return out;
            }
            continue; //number_range already printed the bounds error
        }

        cin.clear();
        cerr << "Invalid input.\n";
    }
}

int main()
{
    int dq_amount = 0;
    deque<deque <cat> > dq_array;
    deque<cat> dq;

    do {
        dq_amount = get_double("INPUT # OF DEQUES: ", 0, 99999999);
        for (int i = 0; i < dq_amount; i++)
        {
            dq_array.push_back(dq);
        }
    } while (!number_range(0, 99999999, dq_amount));
}

In case that's a little obfuscated, the design (just in case it's related to the error) is this: the program asks you to input an integer value, verifies that your input can be read as a number, and then further checks that it falls within certain numerical bounds. Once it's found to be within bounds, I push deques of cat into a deque of deques of cat, as many times as the user's input specifies.

This code has been working for the past few weeks that I've been making this project, but my upper bound had always been 9999, and I decided to standardize it with most of the other inputs in my program, which is an appreciably larger 99,999,999. Running this code with 9999 as the user input works fine, even with 99999999 as the upper bound. The issue is a runtime error that occurs once the user input reaches roughly 10 million.

Is there any particular, clear reason why this doesn't work?

Oh, right, the error message itself from Code::Blocks 13.12:

terminate called after throwing an instance of 'std::bad_alloc'

what(): std::bad_alloc

This application has requested the Runtime to terminate it in an unusual way. Please contact the application's support team for more information.

Process returned 3 (0x3) execution time : 12.559 s Press any key to continue.

I had screenshots, but I need to be 10+ reputation in order to put images into my questions.

  • because you ran out of memory? – Bryan Chen May 14 '14 at 02:18
  • I don't know, did I? I have an 8gb ram machine, and it's idling at 34% ram usage. 10 million things to keep track of seems paltry, unless each container is ~570 bytes or more. – ArnieJ May 14 '14 at 02:22

1 Answer

This looks like address space exhaustion.

If you are compiling for a 32-bit target, you will generally be limited to 2 GiB of user-mode-accessible address space per process, or perhaps 3 GiB on some platforms. (The remainder is reserved for kernel-mode mappings shared between processes.)

If you are running on a 64-bit platform and build a 64-bit binary, you should be able to allocate substantially more memory via new/malloc(), but be advised you may start hitting swap.

Alternatively, you might be hitting a resource quota even if you are building a 64-bit binary. On Linux you can check ulimit -d to see if you have a per-process memory limit.

Jeff
  • 3,475
  • 4
  • 26
  • 35
  • It's not a very big deal. Just know that as your heap/free store grows, the C/C++ runtime library needs to request more address space from the OS. These requests will eventually fail when an artificial cap or platform limit is hit, triggering an `std::bad_alloc`. You can watch your process's memory usage with Task Manager/top/whatever and see how big it gets before it dies. You could also try catching the exception and pausing before exiting to make it easier to see when it's dying if you need. – Jeff May 14 '14 at 02:42
  • I honestly don't understand everything you're saying, but I'll work through it: I'm running my machine on 64-bit Win7, running Code::Blocks with what I believe was the 64-bit GNU GCC compiler... I'm not certain I'm talking about the right thing when I say it's not the 32-bit mingw compiler but the other one. But the major thing I'm taking away from your comment is that if I'm not changing the settings to something special, I will probably have a 2 GiB limit for my program. Is this correct, or are you saying I can only call pointers on 2 GiB of memory? – ArnieJ May 14 '14 at 02:43
  • Ah, I see you had already started writing a response to my (now deleted) comment. Okay, so I'll start using the throw/catch/try stuff that I was told about but haven't actually started using yet. I get the gist now, thanks! – ArnieJ May 14 '14 at 02:44
  • @ArnieJ if you're not compiling to a 64bit target, then you're compiling to a 32bit target under Windows. A 32bit host process on Windows has a normal 2GB user-addressable limit of virtual address space. You cannot address more memory than that unless you're using a 32bit version of Windows with 4GT-enabled, or a 64bit version of windows with IMAGE_FILE_LARGE_ADDRESS_AWARE set as a loader option for your 32bit process. [See this for more info](http://msdn.microsoft.com/en-us/library/windows/desktop/bb613473(v=vs.85).aspx). (your code has no public constructor for class `cat`; fix that.) – WhozCraig May 14 '14 at 02:49
  • Actually, you just gave me enough info to perhaps help you a bit further. Last I heard, the vanilla version of MinGW only supports 32-bit builds, so you need to use the MinGW-w64 fork or similar if you want to build 64-bit binaries. 32-bit Windows applications can only address 2 GiB of memory unless you build with `/3gb` in Visual Studio (I don't know the gcc equivalent). This is very likely the limit you're hitting. – Jeff May 14 '14 at 02:50
  • Thanks, both of you guys. I think I have enough information to decide how I want to go about allowing this to progress, and now I definitely know why it's not just some random, inexplicable error. – ArnieJ May 14 '14 at 02:58
  • Glad we could help. As a final note, `sizeof(std::deque)` will run ~40B on most 32b targets, so you will probably die when `number_of_printers` times your range size is greater than about 53M, or much sooner if you're putting anything at all in the inner deques. A lot of programming comes down to simple arithmetic, and the amount of memory you have available isn't perhaps as vast as you might first guess. – Jeff May 14 '14 at 03:24