I recently found out that bits in DRAM can be randomly flipped by the decay of radioactive particles inside the chip package or by cosmic rays, and I wondered how often these errors occur.
Unfortunately, the most recent statistic I could find is from 1990 (source), which states that roughly one error per month should occur per 128 MB of memory.
Since I couldn't find any recent statistics on soft error rates in modern RAM, I tried to write a Java program to measure the soft error frequency in 4 GB of my RAM. I expect the program to detect every soft error in the allocated 4 GB, provided the check isn't optimized away.
The problem is that I have no idea how to verify that the program works (I assume it doesn't, because of optimization), and I don't know how to change it so that it works as intended.
Based on the 1990 statistic, and since 4 GB is 32 × 128 MB, I should expect to detect an error roughly every 22.5 hours (720 hours per month ÷ 32), assuming modern hardware has no better soft error rate than hardware from the 90s. I would therefore need to run the program for almost a week to state with 99% confidence that it works.
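To make the confidence figure concrete (my own back-of-the-envelope calculation, modeling the errors as a Poisson process with a mean of one error per 22.5 hours): the probability of seeing at least one error within time T is

1 − e^(−T / 22.5 h)

Setting this equal to 0.99 gives T = 22.5 h × ln(100) ≈ 104 hours, i.e. about 4.3 days.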
The following loop is the most important part of my program:
int[] memory = new int[1_073_741_824]; // 2^30 ints * 4 bytes = 4 GiB, zero-initialized
while (true) {
    for (int i = 0; i < memory.length; i++) {
        if (memory[i] != 0) {
            // soft error found: a bit in this word flipped away from 0
            memory[i] = 0;
            // print information about the error to a log file
        }
    }
    Thread.sleep(60_000); // sleep for a minute (enclosing method declares InterruptedException)
}
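For completeness, allocating the array requires a JVM heap larger than 4 GiB, so I start the program with an increased heap limit; the class name here is just a placeholder:

java -Xmx5g SoftErrorMonitor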
What can I do to prevent optimization from breaking the intended behavior of the program?
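One idea I had, though I'm not sure it is sufficient, is to fold every value read into a checksum and write it to a volatile field on each pass, so the array reads have an observable side effect the JIT cannot prove dead. The class and field names below are just my own sketch:

import java.util.concurrent.TimeUnit;

public class SoftErrorMonitor {
    // Volatile sink: the checksum written here each pass makes the
    // array reads observable, so they cannot be eliminated as dead code.
    private static volatile long sink;

    public static void main(String[] args) throws InterruptedException {
        int[] memory = new int[1_073_741_824]; // 4 GiB of zero-initialized ints
        while (true) {
            long checksum = 0;
            for (int i = 0; i < memory.length; i++) {
                int v = memory[i];
                if (v != 0) {
                    // log the error (simplified to stderr here)
                    System.err.printf("bit flip at index %d: value %d%n", i, v);
                    memory[i] = 0;
                }
                checksum += v; // every read feeds the checksum
            }
            sink = checksum; // volatile write: a side effect that must be kept
            TimeUnit.MINUTES.sleep(1);
        }
    }
}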
P.S. If you think my code wouldn't work even without any optimization, please explain why.