I was reading a textbook which says:
Assume the probability of misprediction is p, the time to execute the code without misprediction is T_OK, and the misprediction penalty is T_MP. Then the average time to execute the code as a function of p is:
T_avg(p) = (1 − p) T_OK + p (T_OK + T_MP)
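If I translate that formula into code exactly as written (the function and parameter names below are just my own), I read it as:

```python
def t_avg_textbook(p, t_ok, t_mp):
    # With probability (1 - p) the branch is predicted correctly and the code
    # takes t_ok; with probability p it is mispredicted and takes t_ok plus
    # the penalty t_mp.
    return (1 - p) * t_ok + p * (t_ok + t_mp)
```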
I'm a little confused; shouldn't it be:
T_avg(p) = (1 − p) T_OK + p T_MP
For example, let's say p is 0.5, the CPU takes 10 clock cycles when the branch prediction is correct, and it takes 20 clock cycles when the prediction is incorrect. Isn't the average then 0.5 × (10 + 20) = 15 clock cycles?
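Here is the quick sanity check I did in Python, treating 20 cycles as the total time on a misprediction (that interpretation is my own and may be exactly where I'm going wrong):

```python
p = 0.5
correct_cycles = 10      # total cycles when the branch is predicted correctly
mispredict_cycles = 20   # total cycles when the branch is mispredicted (my reading)

# Weight each outcome's total time by its probability.
t_avg = (1 - p) * correct_cycles + p * mispredict_cycles
print(t_avg)  # 15.0
```

which matches the 15 cycles I get by hand.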