Maximal munch strikes again.
[lex.pptoken]/p3:
If the input stream has been parsed into preprocessing tokens up to a given character:
- [two exceptions not relevant here]
- Otherwise, the next preprocessing token is the longest sequence of characters that could constitute a preprocessing token, even if that would cause further lexical analysis to fail, except that a header-name (2.8) is only formed within a `#include` directive (16.2).
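
The same greediness shows up with ordinary operators too; a classic example (variable names invented here):

    int main() {
        int a = 1, b = 2;
        int c = a+++b;  // lexed greedily as "a ++ + b", i.e. (a++) + b
        return c;       // c == 3, and a is now 2
    }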
The problem is that `0e1_e+0`, unlike `0e1_a+0`, is a valid preprocessing number ([lex.ppnumber]):
    pp-number:
        digit
        . digit
        pp-number digit
        pp-number identifier-nondigit
        pp-number ' digit
        pp-number ' nondigit
        pp-number e sign
        pp-number E sign
        pp-number .
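
To see just how greedy this grammar is, here is a minimal sketch of a scanner for it (my own illustration, ignoring universal-character-names; `scan_pp_number` is a hypothetical helper, not anything from a real compiler):

    #include <cctype>
    #include <cstddef>
    #include <string_view>

    // Length of the longest pp-number at the start of s (0 if there is none).
    std::size_t scan_pp_number(std::string_view s) {
        std::size_t i = 0;
        if (!s.empty() && std::isdigit((unsigned char)s[0]))
            i = 1;                                      // pp-number: digit
        else if (s.size() > 1 && s[0] == '.' && std::isdigit((unsigned char)s[1]))
            i = 2;                                      // pp-number: . digit
        else
            return 0;
        while (i < s.size()) {
            char c = s[i];
            // Try "e sign" / "E sign" first, so the sign gets swallowed too.
            if ((c == 'e' || c == 'E') && i + 1 < s.size()
                && (s[i + 1] == '+' || s[i + 1] == '-'))
                i += 2;                                 // pp-number e/E sign
            else if (std::isalnum((unsigned char)c) || c == '_' || c == '.')
                ++i;                                    // digit, identifier-nondigit, or .
            else if (c == '\'' && i + 1 < s.size()
                     && (std::isalnum((unsigned char)s[i + 1]) || s[i + 1] == '_'))
                i += 2;                                 // pp-number ' digit / ' nondigit
            else
                break;
        }
        return i;
    }

    // scan_pp_number("0e1_e+0") == 7: the whole thing is one pp-number.
    // scan_pp_number("0e1_a+0") == 5: only "0e1_a", since "a sign" is not a production.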
As a result, `0e1_e+0` is parsed as a single pp-number preprocessing token, and then explodes later because that pp-number cannot be converted to a valid token: no integer or floating-point literal, user-defined or otherwise, has that spelling. `0e1_a+0`, on the other hand, is parsed as three tokens, `0e1_a`, `+`, and `0`, and all is well.
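
And if you want to watch it happen, a minimal sketch using two hypothetical literal suffixes `_e` and `_a` (names invented for this example):

    // Two user-defined literal operators; each is fine on its own.
    long double operator""_e(long double x) { return x; }
    long double operator""_a(long double x) { return x; }

    int main() {
        auto good = 0e1_a+0;    // three tokens: 0e1_a, +, 0 -> operator""_a(0e1) + 0
        // auto bad = 0e1_e+0;  // one pp-number "0e1_e+0": no valid token, ill-formed
        auto fixed = 0e1_e + 0; // whitespace stops the munch: 0e1_e, +, 0
        return static_cast<int>(good + fixed);
    }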