Motivating background info: I maintain a C++ library, and I spent way too much time this weekend tracking down a mysterious memory-corruption problem in an application that links to this library. The problem eventually turned out to be caused by the fact that the C++ library was built with a particular `-DBLAH_BLAH` compiler-flag, while the application's code was being compiled without that `-DBLAH_BLAH` flag, and that led to the library-code and the application-code interpreting the classes declared in the library's header-files differently in terms of data-layout. That is: `sizeof(ThisOneParticularClass)` would return a different value when invoked from a .cpp file in the application than it would when invoked from a .cpp file in the library.
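To illustrate the kind of mismatch involved (the class name, flag name, and field are made up here, not the library's actual code), the problem boils down to a header whose layout depends on the preprocessor:

```cpp
// Hypothetical header shared by the library and the application.
// Only illustrates the layout divergence described above.
class ThisOneParticularClass {
public:
    int GetValue() const { return _value; }

private:
    int _value;
#ifdef BLAH_BLAH
    int _extraDebugField;   // present only in translation units compiled with -DBLAH_BLAH
#endif
};
// A .cpp compiled with -DBLAH_BLAH sees a larger sizeof(ThisOneParticularClass)
// than one compiled without it, so objects passed across that boundary get
// read and written with mismatched offsets -- silent memory corruption.
```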
So far, so unfortunate -- I have addressed the immediate problem by making sure that the library and the application are both built using the same preprocessor-flags, and I also modified the library so that the presence or absence of the `-DBLAH_BLAH` flag won't affect the `sizeof()` of its exported classes... but I feel like that wasn't really enough to address the more general problem of a library being compiled with different preprocessor-flags than the application that uses that library. Ideally I'd like to find a mechanism that would catch that sort of problem at compile-time, rather than allowing it to silently invoke undefined behavior at runtime. Is there a good technique for doing that? (All I can think of is to auto-generate a header file with `#ifdef`/`#ifndef` tests for the application code to `#include`, one that would deliberately `#error` out if the necessary `#define`s aren't set, or perhaps would automatically set the appropriate `#define`s right there... but that feels a lot like reinventing `automake` and similar, which seems like potentially opening a big can of worms.)
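A rough sketch of the auto-generated guard-header idea I have in mind (the macro and flag names are hypothetical; the file would be emitted by the library's build system and `#include`d by the library's public headers):

```cpp
// BlahConfigCheck.h -- hypothetical auto-generated guard header.
// Records the flags the library was actually built with, and fails the
// application's compile if the application's flags don't match.

// Written at library-build time by the generator:
#define MYLIB_BUILT_WITH_BLAH_BLAH 1

#if MYLIB_BUILT_WITH_BLAH_BLAH && !defined(BLAH_BLAH)
#error "This library was built with -DBLAH_BLAH; define BLAH_BLAH when compiling code that includes its headers."
#elif !MYLIB_BUILT_WITH_BLAH_BLAH && defined(BLAH_BLAH)
#error "This library was built without -DBLAH_BLAH; do not define BLAH_BLAH when compiling code that includes its headers."
#endif
```

The "automatically set the appropriate `#define`s" variant would simply replace the first `#error` with `#define BLAH_BLAH 1`, at the cost of silently changing the application's configuration rather than surfacing the mismatch.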