How do the following two blocks of pseudo-code compare in terms of speed for both compiled languages and interpreted languages? (Ignoring number of digits)
Essentially, is there any performance loss from writing a variable as an expression of several constants, rather than computing the value by hand beforehand? The expression form often makes the code clearer.
permanentNum = (3.1416 / 3) + 1.5708
return permanentNum / userInputNumber
.
permanentNum = 2.6180
return permanentNum / userInputNumber
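For what it's worth, here is one way to check this empirically in an interpreted language. This is a hedged sketch assuming CPython 3.x: its peephole optimizer folds arithmetic on literal constants at compile time, so the standard-library `dis` module should show both variants loading a single precomputed constant (the function names `computed` and `precomputed` are just illustrative):

```python
import dis

def computed():
    # Constant expression written out for clarity.
    permanentNum = (3.1416 / 3) + 1.5708
    return permanentNum

def precomputed():
    # Same value, computed by hand beforehand.
    permanentNum = 2.6180
    return permanentNum

# In CPython, the literal-only arithmetic in computed() is folded
# at compile time, so both disassemblies load a single constant.
dis.dis(computed)
dis.dis(precomputed)
```

In a compiled language (C, C++, Rust, etc.) the same constant folding is performed by the compiler even at low optimization levels, so I would expect no runtime difference there either, as long as every operand in the expression is a compile-time constant.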
Thanks!