
Recently, when working with custom allocator code and placement new+delete, I noticed something that surprised me: When a virtual destructor is called, it writes to the object's soon-to-be-freed memory.

Why is that?

(update) Aside: I'm more interested in the actual behavior here, not what the C++ standard has to say, which I'm sure does not specify this behavior.

Here's a small program to demonstrate:

#include <new>
#include <cstring>
#include <iostream>

using std::cout;
using std::endl;

struct Interface {
    virtual ~Interface() = default;
};

struct Derived : public Interface {
};

alignas(Derived) unsigned char buffer[sizeof(Derived)];

int main() {

    std::memset(buffer, 0xff, sizeof(buffer));
    cout << "Initial first byte: 0x" << std::hex << (int)buffer[0] << endl;
    
    // Create an instance, using 'buffer' as storage
    Derived *pDer = ::new (buffer) Derived();
    cout << "After ctor, first byte: 0x" << std::hex << (int)buffer[0] << endl;
    
    pDer->~Derived();
    
    cout << "After destroy, first byte: 0x" << std::hex << (int)buffer[0] << endl;

    return 0;
}

Live link: https://godbolt.org/z/jWv6qs3Wc

Here is the output:

Initial first byte: 0xff
After ctor, first byte: 0x68
After destroy, first byte: 0x88

If I remove the virtual Interface, then the memory never changes at all, as expected.
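
For reference, a minimal variant of the above with the virtual destructor removed (PlainInterface/PlainDerived are just names for this sketch); every print then shows 0xff:

#include <new>
#include <cstring>
#include <iostream>

// Same shape as before, but nothing is virtual, so there is no vtable pointer.
struct PlainInterface {
    ~PlainInterface() = default;
};

struct PlainDerived : public PlainInterface {
};

alignas(PlainDerived) unsigned char plainBuffer[sizeof(PlainDerived)];

int main() {
    std::memset(plainBuffer, 0xff, sizeof(plainBuffer));
    std::cout << "Initial first byte: 0x" << std::hex << (int)plainBuffer[0] << std::endl;

    PlainDerived *p = ::new (plainBuffer) PlainDerived();
    std::cout << "After ctor, first byte: 0x" << std::hex << (int)plainBuffer[0] << std::endl;

    p->~PlainDerived();
    std::cout << "After destroy, first byte: 0x" << std::hex << (int)plainBuffer[0] << std::endl;
    // All three lines print 0xff: with nothing virtual, neither the constructor
    // nor the destructor has a vtable pointer to write into the storage.
    return 0;
}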

Is this some kind of debug functionality?

It seems compiler-specific. Clang does not do it, but GCC does.

It does seem to go away with -O2, but I'm still not sure of its purpose.

jwd
  • I doubt that the standard has anything to say about this (except possible UB?). More likely a matter of compiler implementation details. – super Apr 23 '21 at 04:50
  • Perhaps it writes a pattern to make debugging easier? – Ted Lyngmo Apr 23 '21 at 05:04
  • It is common practice for debug compilations to overwrite freed or uninitialized memory with a bit pattern that will A) likely cause crashes when used; B) assist in recognizing other issues such as buffer overruns. – paddy Apr 23 '21 at 05:11
  • Your program causes undefined behaviour by accessing objects that no longer exist (the `new` causes the lifetime of the chars in the char array to end) – M.M Apr 23 '21 at 05:12
  • Also the buffer might not be correctly aligned for `Derived` – M.M Apr 23 '21 at 05:14
  • All the standard mandates is that calling the destructor ends the life of the affected object. Apart from anything explicitly done by the destructor, the standard neither requires nor prevents the implementation modifying the memory occupied by the object. Odds are, what you're seeing is something provided by your compiler to assist debugging or something related to alignment of the object relative to the buffer you have provided. – Peter Apr 23 '21 at 05:17
  • @M.M oh, thanks for the `alignas` note; I fixed it (same behavior). I agree it is undefined behavior as written; I'm really trying to understand the specifics of what is *actually* happening, not just what the standard has to say. I'll clarify that in the question. I wonder: is it still undefined behavior if it is a `void*` coming from `malloc`? Surely `free` is allowed to access that memory afterwards. – jwd Apr 23 '21 at 05:28
  • @jwd OT: Better to use `std::aligned_storage` or `std::aligned_union`. – Daniel Langr Apr 23 '21 at 05:29

1 Answer


To destroy a Derived, conceptually Derived::~Derived is called (which does nothing in this case), then the vtable pointer is adjusted so that the object is an Interface, then Interface::~Interface is called. What you are observing is the pointer to Interface's vtable (constructing a plain Interface gives the same first byte).

If you enable optimisations, then since Interface::~Interface does nothing, Derived::~Derived can be optimised to a no-op too, and you see the same first byte printed.
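
As a rough way to check this, one can construct a plain Interface in a second buffer and compare first bytes with what ~Derived leaves behind; with GCC and no optimisation the two should match, since both are the low byte of the pointer to Interface's vtable:

#include <new>
#include <iostream>

struct Interface {
    virtual ~Interface() = default;
};

struct Derived : public Interface {
};

alignas(Derived) unsigned char derivedBuffer[sizeof(Derived)];
alignas(Interface) unsigned char interfaceBuffer[sizeof(Interface)];

int main() {
    // Construct and destroy a Derived: Interface::~Interface (run as part of
    // ~Derived) stores the Interface vtable pointer into the object.
    Derived *d = ::new (derivedBuffer) Derived();
    d->~Derived();

    // Construct a plain Interface: its constructor stores that same vtable pointer.
    Interface *i = ::new (interfaceBuffer) Interface();

    std::cout << "After ~Derived, first byte:       0x" << std::hex << (int)derivedBuffer[0] << std::endl;
    std::cout << "After Interface ctor, first byte: 0x" << std::hex << (int)interfaceBuffer[0] << std::endl;

    i->~Interface();
    return 0;
}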

Artyer