
I have an application which basically goes like this:

Init();
while(true){
    read_data();
    process_data();
    write_processed_data();
}

The data allocated during one pass of the loop is entirely discarded once write_processed_data() has finished. What's more, the amount of allocation in each pass is bounded by a constant.

I thought I could allocate with placement new at a moving pointer inside a fixed-size block, and then at the end of each loop pass simply reset the moving pointer to the start of the block, avoiding delete altogether. However, my attempts have not been very successful; it seems harder to achieve than I initially planned. Is this a common allocation strategy? In a program of this kind I could gain speed and not have to worry about memory leaks.
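Concretely, here is a minimal sketch of the kind of allocator I am trying to write (the Arena name and interface are just for illustration):

```cpp
#include <cstddef>
#include <new>

// A fixed block of memory with a moving pointer ("bump allocation").
class Arena {
public:
    explicit Arena(std::size_t size)
        : block_(new char[size]), size_(size), offset_(0) {}
    ~Arena() { delete[] block_; }

    // Hand out suitably aligned raw memory from the block.
    void* allocate(std::size_t n,
                   std::size_t align = alignof(std::max_align_t)) {
        std::size_t p = (offset_ + align - 1) & ~(align - 1);  // round up
        if (p + n > size_) throw std::bad_alloc();
        offset_ = p + n;
        return block_ + p;
    }

    // "Free" everything from this pass by resetting the moving pointer.
    void reset() { offset_ = 0; }

private:
    char*       block_;
    std::size_t size_;
    std::size_t offset_;
};
```

Objects would be constructed with placement new, e.g. `T* t = new (arena.allocate(sizeof(T), alignof(T))) T;`, and for types that are not trivially destructible the destructor would have to be called explicitly (`t->~T();`) before `reset()`.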

Shawn
user1741137
  • Is the amount of data variable? If so, does it have an upper bound? – Ed Heal Jul 25 '14 at 19:32
  • I worked on an application that had a similar memory allocation strategy which worked out well. What kind of problems are you facing? – Jason Jul 25 '14 at 19:38

3 Answers


If you have an upper bound, you do not need the heap in the first place.

That is the best solution.
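For example (the record type and the bound here are made up), a fixed-capacity buffer with automatic storage does the job with no heap allocation at all:

```cpp
#include <array>
#include <cstddef>

struct Sample { double value; int channel; };  // made-up record type

constexpr std::size_t kMaxSamples = 1024;      // the known upper bound

// One pass of the loop, working entirely in a fixed-capacity buffer;
// returns how many records were produced this pass.
std::size_t one_pass(std::array<Sample, kMaxSamples>& buffer) {
    std::size_t count = 0;
    // read_data() would fill buffer[0..count):
    buffer[count++] = Sample{1.5, 0};
    // process_data() / write_processed_data() would use buffer[0..count).
    return count;  // the next pass starts from count = 0; nothing to free
}
```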

Ed Heal
  • This should be at the top of the list of answers, if only because of how much it adds to the discussion in so few words; let people read this answer first and then read any others in that light. – David K Jul 25 '14 at 21:12

Thanks to @EdHeal for pointing out the basic idea behind the following approach:

Before your loop, within your block of memory, "allocate" large-enough arrays of all the concrete data types you will need. Use these as your set of object pools. Unlike the typical implementation of object pools, you never have to grow your pools, because when you create the arrays at the start you know an upper bound on the number of objects of each type that you might use. (Your problem statement implies that you have this information at least before entering the loop.)

During the loop, each time you need a new object, get its pointer from its pool. You can implement a function that puts a pointer back in the pool to be reused as a "new" object, or, if the total number of "new" allocations per pass is known and not much greater than the high-water number of objects in use at any one time, you don't even need to bother putting objects back in the pool. (The second option makes pool management very easy: all you have to do is keep count of the objects taken from each pool, and that count tells you which object is the next "new" one.) In either case, at the end of the loop pass, reset the pools so that all objects are available to be "new" again.

I'd probably create a class called something like Allocator to manage the pools, with a function for each kind of object it can take from the pool or put back in the pool and a reset() function to call at the end of the loop. There may be more elegant ways to do the job.

Note that if you know the upper bounds of your required numbers of objects precisely enough, well enough in advance, you can create all of the above-mentioned data structures without putting anything on the heap.
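A sketch of the simple "count only" variant (Foo, Bar, and the bounds are placeholders for the real types and limits):

```cpp
#include <array>
#include <cassert>
#include <cstddef>

// Placeholder types for whatever the loop actually creates.
struct Foo { int x = 0; };
struct Bar { double y = 0.0; };

// Each pool is a fixed array plus a counter of how many objects
// have been handed out during the current pass.
class Allocator {
public:
    Foo* newFoo() { assert(fooCount_ < kMaxFoos); return &foos_[fooCount_++]; }
    Bar* newBar() { assert(barCount_ < kMaxBars); return &bars_[barCount_++]; }

    // At the end of a loop pass, every object becomes available again.
    void reset() { fooCount_ = 0; barCount_ = 0; }

private:
    static constexpr std::size_t kMaxFoos = 256;  // made-up upper bounds
    static constexpr std::size_t kMaxBars = 128;

    std::array<Foo, kMaxFoos> foos_;
    std::array<Bar, kMaxBars> bars_;
    std::size_t fooCount_ = 0;
    std::size_t barCount_ = 0;
};
```

In the loop, `allocator.newFoo()` replaces `new Foo`, and a single `allocator.reset()` at the end of the pass replaces all the deletes. Note that `reset()` does not re-run constructors, so a reused object keeps its previous state unless reinitialized.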

David K
  • You can put the lot on the stack - like the software I wrote for a power station that does not use the heap at all – Ed Heal Jul 25 '14 at 20:57
  • Right, that was what I had in mind, though the ease with which you can do it might depend on how constant that "constant" in the original question really is. – David K Jul 25 '14 at 21:09

One way to avoid many unneeded calls to new and delete is to use an object pool and override the class's operator new and operator delete. Basically:

// In header file
class Thing {
  // ...
  void* operator new(std::size_t size);
  void operator delete(void* ptr);
};

// In .cc file
#include <cstdlib>  // ::malloc
#include <vector>

namespace thingpool {
static std::vector<Thing*>* chunks = 0;
}

void* Thing::operator new(size_t size){
  // If pool is not initialized
  if (!thingpool::chunks){
    thingpool::chunks = new std::vector<Thing*>;
    thingpool::chunks->reserve(100);  // or some upper bound on expected Things
  }

  void* memptr;
  if(!thingpool::chunks->empty()) {
    memptr = thingpool::chunks->back();
    thingpool::chunks->pop_back();
  } 
  else {
    memptr = ::malloc(size);
  }
  return memptr;
}

// put memory back in pool
void Thing::operator delete(void* ptr)
{
  if (thingpool::chunks)
    thingpool::chunks->push_back(static_cast<Thing*>(ptr));
}

And clean up the thing pool at some point: ::free() each cached pointer (the memory came from ::malloc), then delete the vector itself.

jonas25007
  • Instead of overloading `operator new` and `operator delete` for the class, an allocator would make more sense to me. – Mooing Duck Jul 25 '14 at 19:51