I need to allocate space for a temporary array once per iteration. I try to use realloc each iteration to optimize memory usage, like this:

int *a = (int*)std::malloc(2 * sizeof(int));
for(int i=0; i<N; ++i) 
{
    int m = calculate_enough_size();
    a = (int*)std::realloc(m * sizeof(int));
    // ...
}
std::free(a);

N is a big number, 1000000 for example. Example m values per iteration are: 8, 2, 6, 10, 4, 8

Am I doing it right when I realloc a at each iteration? Does it prevent redundant memory allocation?

KiaMorot
user1312837

3 Answers

Firstly, realloc takes two parameters: the first is the original pointer and the second is the new size. You are passing the size as the original pointer, so the code shouldn't even compile.

Secondly, the obligatory reminder: Don't optimize prematurely. Unless you've measured and found that the allocations are a bottleneck, just use std::vector.
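For illustration, a minimal sketch of the std::vector approach, with a hypothetical calculate_enough_size that just cycles through the example sizes from the question:

```cpp
#include <vector>

// Hypothetical stand-in for the asker's calculate_enough_size(),
// cycling through the example sizes from the question: 8, 2, 6, 10, 4, 8.
int calculate_enough_size(int i) {
    static const int sizes[] = {8, 2, 6, 10, 4, 8};
    return sizes[i % 6];
}

// Runs the loop with a std::vector and returns how many times the
// underlying buffer actually grew.
int run(int N) {
    std::vector<int> a;
    int growths = 0;
    for (int i = 0; i < N; ++i) {
        int m = calculate_enough_size(i);
        auto old_cap = a.capacity();
        a.resize(m);   // reallocates only when m exceeds current capacity
        if (a.capacity() != old_cap) ++growths;
        // ... use a[0] .. a[m-1] ...
    }
    return growths;    // memory is freed automatically when a is destroyed
}
```

Since resize never shrinks capacity, the buffer stabilizes after the first few iterations; even over a million iterations, only the first requests that exceed the current capacity trigger a real allocation.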

eerorika

A few issues I have noticed:

  • realloc should be used when you want the old values to remain in memory. Since you don't care about the old values, as you mentioned in one of your comments, just use malloc.

  • Check the size of the already-allocated memory before allocating again; allocate new memory only if the current block is too small for the new data.

Please refer to the sample code, which takes care of the problems mentioned above:

int size = 2;
int *a = (int*)std::malloc(size * sizeof(int));

for(int i=0; i<N; ++i) 
{
   int m = calculate_enough_size();
   if(m > size)
   {
       size = m;
       std::free(a);
       a = (int*)std::malloc(size * sizeof(int));
   }
   // ...
}
std::free(a);

You can also further optimize memory allocation by allocating some extra headroom, e.g.:

size = m*2; //!

To better understand this step, take an example: suppose m = 8, then you allocate memory for 16 elements, so when m later changes to 10, 12, or anything up to 16, there is no need to allocate memory again.
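The effect can be sketched by counting allocations for a request sequence, with and without the factor-of-2 headroom (the helper name here is made up for illustration):

```cpp
// Counts how many allocations a sequence of size requests triggers
// when each growth over-allocates by the given factor (factor == 1
// means an exact fit, factor == 2 means the doubling trick above).
int count_allocations(const int *requests, int n, int factor) {
    int size = 0, allocations = 0;
    for (int i = 0; i < n; ++i) {
        if (requests[i] > size) {
            size = requests[i] * factor;
            ++allocations;
        }
    }
    return allocations;
}
```

For the question's sequence 8, 2, 6, 10, 4, 8 this gives 2 allocations with an exact fit, but only 1 with factor 2, since the jump to 10 fits in the 16 elements allocated up front.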

Hemant Gangwar

If you can get all the sizes beforehand, allocate the biggest block you need before the loop and then use as much of it as needed per iteration.

If, on the other hand, you can not do that, then reallocation is a good solution, I think.

You can also further optimize your solution by reallocating only when a bigger size is needed:

int *a = nullptr;   // realloc(nullptr, n) behaves like malloc
int size = 0;
for(int i = 0; i < N; ++i) 
{
    int new_size = calculate_enough_size();
    if ( new_size > size ){
        a = (int*)std::realloc(a, new_size * sizeof(int));
        size = new_size;
    }

    // ...
}
std::free(a);

This way you will need fewer reallocations (half of them in a randomized case).
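To get a feel for the saving, here is a small counting sketch (hypothetical setup: sizes drawn uniformly from 1 to 100 with <random>):

```cpp
#include <random>

// Counts how many iterations actually trigger a realloc under the
// "grow only when bigger" rule, for random sizes in [1, 100].
int count_reallocs(int N, unsigned seed) {
    std::mt19937 gen(seed);
    std::uniform_int_distribution<int> dist(1, 100);
    int size = 0, reallocs = 0;
    for (int i = 0; i < N; ++i) {
        int new_size = dist(gen);
        if (new_size > size) {
            size = new_size;
            ++reallocs;
        }
    }
    return reallocs;
}
```

A reallocation happens only when a new running maximum appears, so for N = 1000000 the count stays far below N (at most 100 here, since the sizes are bounded).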

grzkv