
Suppose I have the following preprocessor definition:

#define MYNUMBER 10f;

I would then like to use it in my code as follows:

float someResult = MYNUMBER * 3;

When I do this, Xcode thinks I am trying to use * as a unary dereference operator rather than as multiplication, and reports an error. What is the correct way to define such a constant so that it can be used in a multiplicative expression?


1 Answer


You shouldn't have a semicolon after your #define. The preprocessor thinks MYNUMBER is "10f;", so your line expands to float someResult = 10f; * 3; and the leftover * 3; is parsed as a statement beginning with a dereference.
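For reference, here is a minimal sketch of the corrected definition in plain C (note that I also write the literal as 10.0f, since 10f on its own is not a valid floating constant):

#include <stdio.h>

/* No trailing semicolon, and a decimal point so the f suffix is valid. */
#define MYNUMBER 10.0f

int main(void) {
    float someResult = MYNUMBER * 3;         /* expands to 10.0f * 3 */
    printf("someResult = %f\n", someResult); /* prints 30.000000 */
    return 0;
}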
