
I need to build a bar graph that illustrates the distribution of pseudorandom numbers generated by the linear congruential method:

X_{n+1} = (a * X_n + c) mod m
U_n = X_n / m

on the interval [0,1]

For example:

    Interval     Frequency
    [0;0.1]      0.05
    [0.1;0.2]    0.15
    [0.2;0.3]    0.1
    [0.3;0.4]    0.12
    [0.4;0.5]    0.1
    [0.5;0.6]    0.15
    [0.6;0.7]    0.05
    [0.7;0.8]    0.08
    [0.8;0.9]    0.16
    [0.9;1.0]    0.4

I have written the following program:

lcg.h:

#include <vector>

class LCG {
public:
    LCG();
    ~LCG();
    void setSeed(long);
    float getNextRand();
    void countFrequency();
    void printFrequency();

private:
    std::vector<int> frequencies;
    long seed;
    static const long a = 33;
    static const long c = 61;
    static const long m = 437;
};

lcg.cpp:

#include "lcg.h"
#include <cmath>
#include <iostream>
using namespace std;

void LCG::setSeed(long newSeed)
{
    seed = newSeed;
}

LCG::LCG() {
    setSeed(1);
}

LCG::~LCG() { }

float LCG::getNextRand() {
    seed = (seed * a + c) % m;
    return (float)seed / (float)m;
}

void LCG::countFrequency()
{
    for (int i = 0; i < 10; ++i)
        frequencies[i] = 0;
    for (int i = 0; i < m; ++i)
    {
        float u = getNextRand();
        int r = ceil(u * 10.0);
        frequencies[r] = frequencies[r] + 1;
    }
}

void LCG::printFrequency()
{
    for (int i = 0; i < 10; ++i)
    {
        const float rangeMin = (float)i / 10.0;
        const float rangeMax = (float)(i + 1) / 10.0;
        cout << "[" << rangeMin << ";" << rangeMax << "]"
            << " | " << frequencies[i] << endl;
    }
}

main.cpp:

#include "lcg.h"

int main()
{
    LCG l;
    l.countFrequency();
    l.printFrequency();
}

It compiles and lints cleanly, but it does not run. I have no idea what is wrong with my program. Something seems to be wrong in the functions countFrequency and printFrequency, but I cannot figure out what. Maybe you know?

niar_q
  • Is that your full program? If so: you need a `main` function. If not, can you clarify the problem? How do you compile it? What do you mean by saying it does "not want to run"? What happens when you try to run it? – Chris H Oct 20 '15 at 11:18
  • Have you tried running in a debugger? What happens then? – Some programmer dude Oct 20 '15 at 11:20
  • 1
    1. `frequencies` is not sized correctly, 2. Use `double` not `float`, 3. Ditch that destructor. Let the compiler use the default one. – Bathsheba Oct 20 '15 at 11:21

1 Answer


This part is wrong:

for (int i = 0; i < 10; ++i)
    frequencies[i] = 0;

At this point your frequencies vector is empty, and you can't access its elements like this: the index is out of bounds, which is what causes the crash. To populate the vector, use push_back():

for (int i = 0; i < 10; ++i)
    frequencies.push_back(0);

Other minor stuff:

  • your constructor does too much work:

    LCG::LCG() {
        setSeed(1);    
    }
    

    the proper way would be to use a member initializer list: LCG::LCG() : seed(1) { }

  • If you don't do anything special in the destructor, don't define it at all, let the compiler do it for you.

  • Use double instead of float for some extra precision; std::ceil operates on doubles anyway.

  • There is a second out-of-bounds risk: int r = ceil(u * 10.0); yields indices from 1 to 10 rather than 0 to 9, so frequencies[r] can still run past the end even after sizing the vector. int r = (int)(u * 10.0); maps u into bins 0..9.
SingerOfTheFall