I have simulated 4 different binaries in the SimpleScalar simulation tool, and for every binary the L2 unified miss rate is greater than the L1 data miss rate.
For my assignment I am supposed to do some analysis. The first thing that comes to mind is that the L2 miss rate should be smaller, since L2 is higher in the hierarchy and larger than the L1 cache.
Besides, as far as I know, L2 is referenced only when there is a miss in the L1 cache. So, from my point of view, most of the time L2 should hold the data that L1 does not, and its miss rate should be lower.
However, the results are not close to what I expected.
For instance,
- L1 Data Miss Rate : 0.0269
- L2 Unified Miss Rate : 0.0566
The miss rate is determined as misses / references to the cache.
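
To make sure I'm reading the statistics correctly, here is a minimal sketch of how I understand each level's miss rate to be computed. The raw counts below are hypothetical (my run only reports the rates themselves), and I've assumed L2 is accessed only on an L1 miss, as described above:

```python
# Hypothetical raw counts; only the resulting rates match my actual run.
l1_accesses = 1_000_000        # every data reference goes to L1 first (assumed count)
l1_misses   = 26_900           # ~0.0269 of L1 accesses

l2_accesses = l1_misses        # L2 is referenced only on an L1 miss (as I understand it)
l2_misses   = 1_523            # ~0.0566 of L2 accesses

# Miss rate for each cache = misses / references to that cache
l1_miss_rate = l1_misses / l1_accesses
l2_miss_rate = l2_misses / l2_accesses

print(f"L1 data miss rate:    {l1_miss_rate:.4f}")  # 0.0269
print(f"L2 unified miss rate: {l2_miss_rate:.4f}")  # 0.0566
```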
What is wrong with my approach? Why is the L2 miss rate greater than the L1 miss rate?