I have a std::set containing a large number of unique objects as its elements.
In the main thread of program:
- I take some objects from the set
- Assign data to be processed to each of them
- Remove those objects from the set
- And finally, pass the objects to threads in a thread pool for processing
- Once those threads finish processing the objects, they add them back to the set (so that in the next iteration, the main thread can assign the next batch of data to those objects)
This arrangement works perfectly. But if I encounter an error while adding an object back to the set (for example, std::set::insert() throws std::bad_alloc), then it all goes for a toss.
If I ignore the error and proceed, there is no way for the object to get back into the processing set; it stays out of the program flow forever, effectively a memory leak.
To address this, I tried not removing objects from the set at all. Instead, each object has a member flag indicating it is 'being processed'. But then the main thread keeps encountering 'being processed' objects while iterating over the set, which badly hampers performance (the number of objects in the set is quite large).
What are better alternatives here?
Can std::list be used instead of std::set? A list would not have the bad_alloc problem when adding an element back, provided the existing node is moved rather than re-inserted, since moving a node between lists (splice) only reassigns pointers. But how can we make list elements unique? And if we achieve that, will it be as efficient as std::set?

Alternatively, instead of removing and re-adding elements, is there any way to move an element to the start or end of the set, so that unprocessed and processed objects accumulate together at opposite ends?
Any other solution please?