I read an answer about using std::vector::emplace_back
on std::vector<std::unique_ptr<SomeClass>>
and getting memory leaks when passing raw pointers (my_vec.emplace_back(new SomeClass())) in an exceptional scenario. Since that post is old (2016) and we have newer standards now, I thought I'd run an experiment to see whether this issue has been fixed.
#include <iostream>
#include <memory>
#include <vector>
#include <limits>

struct MyClass
{
    ~MyClass() { std::cout << "Destroyed\n"; }
};

int main()
{
    std::vector<std::unique_ptr<MyClass>> store;
    auto limit = std::numeric_limits<uint32_t>::max();
    std::cout << "Trying to allocate " << limit << " unique ptrs\n";
    for (size_t i = 0; i < limit; i++)
    {
        store.emplace_back(new MyClass()); // THIS LINE
    }
}
I saw that compilers didn't emit any errors or warnings for the line marked with the comment // THIS LINE.
This issue has been known for a long time. Of course, the way to avoid it is to wrap the raw pointer in a smart pointer before passing it. But if somebody uses raw pointers anyway, compilers should warn them, which doesn't happen.
Is there any ongoing work regarding this? If there are already ways to detect such issues, please share them.
Code link: https://godbolt.org/z/d5ooeEWf5
P.S. I thought of trying this experiment because I like C++, and seeing Rust fans point out issues with C++ makes me sad.