While reading about the merits and demerits of using a #define macro versus an equivalent const variable, I came across two statements that seem to contradict each other:
Statement 1: The preprocessor replaces every instance of the macro with its value, so as many copies of the object may be created as there are instances, whereas a const results in only one copy of the object.
Statement 2: Compilers do not set aside storage for constant integral types, and macros do not cause any unnecessary memory allocation either.
Example quoted for statement 1:

#define PI 3.14
const float pi = 3.14f;
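
To make sure I read statement 1 correctly, here is a minimal sketch of my understanding (the comments are my own interpretation, not something taken from the source):

#include <cstdio>

#define PI 3.14              // every use below is replaced by the literal 3.14
const float pi = 3.14f;      // one named object that every use refers to

int main() {
    std::printf("%f %f\n", PI * 2, PI / 2);  // two textual copies of 3.14 after preprocessing
    std::printf("%f %f\n", pi * 2, pi / 2);  // two reads of the same single object
    std::printf("%p\n", static_cast<const void *>(&pi));  // &PI would not even compile
    return 0;
}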
Example quoted for statement 2:
#define COUNT 10
const int count = 10;
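
My understanding of statement 2, again as a small sketch with my own comments:

#define COUNT 10
const int count = 10;

int array_a[COUNT];  // fine: the preprocessor substitutes the literal 10
int array_b[count];  // also fine in C++: count is an integral constant expression,
                     // so the compiler can fold it in without setting aside storage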
From these examples it looks as if the optimization behavior depends on the type used and on the surrounding context. Is that actually the case, and if so, why?
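
For context, this is the kind of situation I suspect matters (use_value and use_address are just names I made up for illustration):

const int count = 10;

int use_value() {
    return count * 2;   // the compiler can fold this to 20; no storage needed for count
}

const int *use_address() {
    return &count;      // taking the address presumably forces count to occupy memory
}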