I am having a hard time understanding clock cycles. Here is the problem: I am given a program with two instruction types, X and Y. X is run 20% of the time and requires 8 clock cycles, while Y is run 80% of the time and requires 2 clock cycles. If my program has 10 million instructions, I need to find:
A. The minimum number of clock cycles to execute one instruction.
B. The maximum speedup, using Amdahl's Law, that can be obtained by improving instruction X.
This is my hunch; please point out where I am wrong. For A, I think the minimum number of clock cycles to execute one instruction is 1. I thought I read this somewhere, but I am not sure.
For B, I am assuming I should solve for the speedup when X takes only 1 clock cycle, because that would mean it is executing as fast as possible.
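To make my reasoning for B concrete, here is a small Python sketch of the arithmetic I have in mind (the variable names are just mine, and I am assuming "improving X" means it drops to 1 cycle):

```python
# My reasoning: baseline CPI is the weighted average of the two instruction types.
frac_x, cycles_x = 0.20, 8
frac_y, cycles_y = 0.80, 2

baseline_cpi = frac_x * cycles_x + frac_y * cycles_y  # 0.2*8 + 0.8*2 = 3.2

# My assumption for B: the "improved" X takes only 1 cycle.
improved_cpi = frac_x * 1 + frac_y * cycles_y          # 0.2*1 + 0.8*2 = 1.8

instructions = 10_000_000
baseline_cycles = instructions * baseline_cpi          # total cycles before the improvement
improved_cycles = instructions * improved_cpi          # total cycles after the improvement

speedup = baseline_cycles / improved_cycles            # 3.2 / 1.8 ≈ 1.78
print(f"baseline CPI = {baseline_cpi}, improved CPI = {improved_cpi}, speedup ≈ {speedup:.2f}")
```

Is that the right way to set up the speedup, or does "maximum speedup" mean something else here?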
Are these assumptions correct? Any help would be appreciated. Thanks.