Under C90, if an object held Indeterminate Value, each bit could independently be zero or one, regardless of whether the bits would in combination represent a valid bit pattern for the object's type. If an implementation specified the behavior of attempting to read every one of the 2ⁿ possible bit patterns an n-bit object could hold, the behavior of reading an object with Indeterminate Value would be equivalent to reading the value of an arbitrarily chosen bit pattern. If there were any bit patterns for which an implementation did not specify the effect of an attempted read, then the effect of trying to read an object that might hold such bit patterns would be likewise unspecified.
Code generation efficiency could be improved in some cases by specifying the behavior of uninitialized objects more loosely, in a way which would not otherwise be consistent with sequential program execution as specified but would nonetheless meet program requirements. For example, given something like:
struct foo { short dat[16]; } x,y,z;
void test1(int a, int b, int c, int d)
{
    struct foo temp;
    temp.dat[a] = 1;
    temp.dat[b] = 2;
    temp.dat[c] = 3;
    temp.dat[d] = 4;
    x = temp;
    y = temp;
}
void test2(int a, int b, int c, int d)
{
    test1(a,b,c,d);
    z = x;
}
If client code only cares about the values of x and y that correspond to values of temp that were written, efficiency might be improved, while still meeting requirements, if the code were rewritten as:
void test1(int a, int b, int c, int d)
{
    x.dat[a] = 1;
    y.dat[a] = 1;
    x.dat[b] = 2;
    y.dat[b] = 2;
    x.dat[c] = 3;
    y.dat[c] = 3;
    x.dat[d] = 4;
    y.dat[d] = 4;
}
The fact that the original function test1 doesn't do anything to initialize temp suggests that the program won't care about what any individual attempt to read it yields. On the other hand, nothing within the code for test2 would imply that client code wouldn't care about whether all members of x held the same values as the corresponding members of y. Thus, such an inference would be far more dangerous there.
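The observable difference between the two versions can be made concrete with a small harness. The sketch below (function names, seed values, and index choices are illustrative, and it assumes the rewrite stores into y the same values it stores into x for each written index) shows that written elements of x and y match, but elements not covered by the indices retain whatever x and y previously held -- an outcome the original test1, which copies one temp object into both, could never produce:

```c
#include <string.h>

struct foo { short dat[16]; };
struct foo x, y;

/* The rewritten test1, storing matching values into x and y
   for each written index. */
void test1_rewritten(int a, int b, int c, int d)
{
    x.dat[a] = 1; y.dat[a] = 1;
    x.dat[b] = 2; y.dat[b] = 2;
    x.dat[c] = 3; y.dat[c] = 3;
    x.dat[d] = 4; y.dat[d] = 4;
}

/* Seed x and y with different prior contents, then run the rewrite.
   Written elements agree; unwritten elements keep their old,
   differing contents. */
void demo(void)
{
    memset(&x, 0x11, sizeof x);   /* every short in x becomes 0x1111 */
    memset(&y, 0x22, sizeof y);   /* every short in y becomes 0x2222 */
    test1_rewritten(0, 1, 2, 3);
}
```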
The C Standard makes no attempt to define behavior in situations where an optimization might yield program behavior which, although useful, would be inconsistent with sequential processing of non-optimized code. Instead, the principle that optimizations must never affect any defined behavior is taken to imply that the Standard must characterize as Undefined all actions whose behavior would be visibly affected by optimization, leaving to implementors' discretion the question of which aspects of behavior should or should not be defined in which circumstances. Ironically, the only cases where the Standard's laxity with regard to this behavior would allow more efficient code generation outside contrived scenarios are those where implementations treat the behavior as at least loosely defined, and programmers are able to exploit that. If a programmer had to explicitly initialize all elements of temp to avoid having the compiler behave in completely nonsensical fashion, that would eliminate any possibility of optimizing out the unnecessary writes to unused elements of x and y.
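The defensive version such a programmer would be forced to write might look like the sketch below (a hypothetical variant, not from the original example). Because zeroing temp gives every element a defined value, a conforming compiler must copy all sixteen elements into x and y, which is exactly the work the optimization discussed above would have avoided:

```c
struct foo { short dat[16]; };
struct foo x, y;

/* Defensive variant: temp is fully zeroed before use, so every
   element holds a defined value.  The compiler must now copy all
   sixteen elements into x and y; the writes to unused elements can
   no longer be optimized out. */
void test1_defensive(int a, int b, int c, int d)
{
    struct foo temp = { {0} };
    temp.dat[a] = 1;
    temp.dat[b] = 2;
    temp.dat[c] = 3;
    temp.dat[d] = 4;
    x = temp;
    y = temp;
}
```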