Since C99, C has had a proper Boolean type, _Bool. Objective-C, as a strict superset of C, inherits this; but when Objective-C was created back in the 1980s, C had no Boolean type, so Objective-C defined BOOL as signed char.
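For reference, the historical declaration in the Objective-C runtime headers looks roughly like this (paraphrased, not the exact header text):

    // Roughly what <objc/objc.h> has historically declared (paraphrased):
    typedef signed char BOOL;   // one byte, but any signed char value fits, not just 0 and 1
    #define YES ((BOOL)1)
    #define NO  ((BOOL)0)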
All of Cocoa uses BOOL, as does all non-NeXT/Apple Cocoa code that I've seen. Obviously, for compatibility with existing protocols (e.g., -applicationShouldTerminateAfterLastWindowClosed: from NSApplicationDelegate), matching the already-declared type is preferable, if for no other reason than to avert a warning.
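For example, an implementation of that delegate method keeps the BOOL return type exactly as the protocol declares it (the AppDelegate class name here is just illustrative):

    #import <Cocoa/Cocoa.h>

    @interface AppDelegate : NSObject <NSApplicationDelegate>
    @end

    @implementation AppDelegate
    // Declared by the protocol as returning BOOL, so return BOOL (YES/NO),
    // not bool (true/false), to match the declaration exactly.
    - (BOOL)applicationShouldTerminateAfterLastWindowClosed:(NSApplication *)sender
    {
        return YES;
    }
    @end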
For cleanliness/readability purposes, stdbool.h defines bool as a synonym for _Bool, so those of us who don't want unnecessary underscores in our code can use that.
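A minimal illustration (stdbool.h also provides the true and false macros):

    #include <stdbool.h>

    // bool, true, and false are macros for _Bool, 1, and 0, respectively.
    bool isFinished = false;
    if (!isFinished) {
        isFinished = true;
    }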
Three other useful notes (a short demonstration follows the list):

- @encode(_Bool) evaluates to "B". (@encode(BOOL) evaluates to "c", for signed char.)
- sizeof(_Bool) evaluates to 1, which follows from C99's definition that _Bool is only as large as necessary to hold its two possible values. (Edit: Actually, the standard says only that it must be “large enough” to hold those two values; it does not place an upper bound, and, in fact, Mac OS X on 32-bit PowerPC defines it as 4 bytes. Size difference is another thing to file under possible BOOL-vs.-bool compatibility issues.)
- On that note, the only two possible values of a _Bool are 1 and 0. Any other values are converted to one of these on assignment, as if you had done a double-negation (!!) or tested inequality against 0 (!= 0). The only ways to get a _Bool with some other value are the usual magicks: pointer aliasing and unions.
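Here is a small sketch demonstrating those three points. The printed values assume a platform where BOOL is signed char and sizeof(_Bool) is 1 (e.g., 32-bit Intel Mac OS X); the union trick at the end is implementation-dependent:

    #import <Foundation/Foundation.h>

    int main(void)
    {
        NSLog(@"%s", @encode(_Bool));    // "B"
        NSLog(@"%s", @encode(BOOL));     // "c", for signed char
        NSLog(@"%zu", sizeof(_Bool));    // 1 here; 4 on 32-bit PowerPC Mac OS X

        _Bool plain = 42;                // assignment normalizes any nonzero value...
        NSLog(@"%d", plain);             // ...so this prints 1

        union { _Bool flag; unsigned char raw; } punned;
        punned.raw = 42;                 // writing through the union bypasses the
        NSLog(@"%d", punned.flag);       // normalization, so this may print 42
        return 0;
    }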
Is there any reason not to use _Bool/bool in new code?