Inferring a type inside a type definition appears to follow both paths of a ternary when the inferred type is enumerable (vs. uncountable).
In the code below, when the inferred type can be enumerated (it is a union of multiple values), the type definition appears to take both paths and return a union of the true and false branches of the second (interior) ternary. However, when the inferred type is uncountable (number, string, and so on, with infinitely many values), only one of the interior ternary branches is taken.
Note that in the alt case, when there is no infer, the type is narrowed as expected.

What's going on here? Why does TypeScript behave this way in the presence of infer?
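Here is a simpler reduction I tried (IsTrue is my own helper name, just for illustration), which seems to show the same split happening whenever the type being checked is a union:

// A bare type parameter to the left of extends appears to be checked
// once per union member:
type IsTrue<T> = T extends true ? 1 : 0;
type FromBoolean = IsTrue<boolean>; // 0 | 1 -- boolean behaves like false | true
type FromString = IsTrue<string>;   // 0     -- string is not a union; one check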
I'm not saying this is incorrect behavior; I'm just not sure where this is documented or how to find it in the docs.
Thanks.
// type of union1 == 0 | 1; type of alt1 == 0
var union1:
    boolean extends infer Type ?
        Type extends true ?
            1 :
            0 :
        never;
var alt1: boolean extends true ? 1 : 0;
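The same splitting seems to happen with an explicit string-literal union, so it doesn't look boolean-specific (litUnion is an example I added, not one of the original cases):

// type of litUnion == 1 | 0 (the interior check seems to run per member of "a" | "b")
var litUnion:
    ("a" | "b") extends infer Type ?
        Type extends "a" ?
            1 :
            0 :
        never;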
// type of union4 == 0; type of alt4 == 0
var union4:
    string extends infer Type ?
        Type extends "a" ?
            1 :
            0 :
        never;
var alt4: string extends "a" ? 1 : 0;
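One more data point: wrapping both sides of the interior check in a one-element tuple seems to turn the splitting off, which is what makes me suspect some kind of union distribution is involved:

// type of tupleWrapped == 0 -- the interior check runs once against all of boolean
var tupleWrapped:
    boolean extends infer Type ?
        [Type] extends [true] ?
            1 :
            0 :
        never;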