
I was reading a textbook which says:

Assume the probability of misprediction is p, the time to execute the code without misprediction is T_OK, and the misprediction penalty is T_MP. Then the average time to execute the code as a function of p is:

T_avg(p) = (1 − p) · T_OK + p · (T_OK + T_MP)

I'm a little confused. Shouldn't it be:

T_avg(p) = (1 − p) · T_OK + p · T_MP

For example, let's say p is 0.5, and the CPU takes 10 clock cycles when the branch prediction is correct and 20 clock cycles when it is incorrect. Isn't the average then 0.5 × (10 + 20) = 15 clock cycles?
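
Concretely, here is a minimal sketch (in Python, using the hypothetical numbers above: p = 0.5, 10 cycles on a correct prediction, 20 cycles on an incorrect one) of how I'm computing that average:

    # Hypothetical values from the example above.
    p = 0.5                # probability of misprediction
    cycles_correct = 10    # total cycles when the prediction is correct
    cycles_incorrect = 20  # total cycles when the prediction is incorrect

    # Weighted average over the two outcomes.
    t_avg = (1 - p) * cycles_correct + p * cycles_incorrect
    print(t_avg)  # 15.0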

  • It looks like T_MP only counts the *penalty* of misprediction. That is, the amount of time *wasted* before the code can really start executing. It doesn't include the time to actually execute the code, so that has to be added if you want to get the total time from start to finish. – Nate Eldredge Jul 12 '20 at 01:44
  • It seems like it'd be simpler to write it as T_OK + p*T_MP. – Nate Eldredge Jul 12 '20 at 01:45

1 Answer


T_MP is defined as the misprediction penalty: the time it takes the processor to recognize that it took the wrong path and to get back on track executing the correct one. It does not include the time to actually execute the correct path, i.e. T_OK. That's why a misprediction costs T_OK + T_MP rather than just T_MP to complete the execution. In your example, the 20 cycles for an incorrect prediction is already T_OK + T_MP: 10 cycles to execute the code once the processor is back on the correct path, plus a 10-cycle penalty, so the textbook formula gives 0.5 · 10 + 0.5 · (10 + 10) = 15 cycles, matching your direct calculation.
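
As a minimal sketch (in Python, reusing the question's numbers, so T_OK = 10 cycles and the penalty T_MP = 20 − 10 = 10 cycles), the textbook form and the simplified form from the comments give the same average:

    # Numbers from the question: 10 cycles when predicted correctly,
    # 20 cycles total when mispredicted, so the penalty alone is 10 cycles.
    p = 0.5           # probability of misprediction
    t_ok = 10         # cycles to execute the code with a correct prediction
    t_mp = 20 - t_ok  # misprediction *penalty*: extra cycles wasted, not total time

    # Textbook form: a misprediction costs the penalty *plus* the normal execution.
    t_avg = (1 - p) * t_ok + p * (t_ok + t_mp)

    # Equivalent simplified form from the comments.
    t_avg_simple = t_ok + p * t_mp

    print(t_avg, t_avg_simple)  # 15.0 15.0

Both print 15.0, matching the direct weighted average computed in the question.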

– Unn