The only way I know of to get a multiplier effect in software performance (without different hardware) is to exploit Amdahl's Law, which just says that if you make something take less time, it's faster. (Boy, that's pretty deep.)
If you reduce the time the software takes by a fraction X, then you have given it a speedup ratio of R = 1/(1-X).
For example, if a program takes 100 seconds, and you manage to shave off 60, then you're left with it taking 40 seconds, which is 100/40 = 2.5 times as fast as it was.
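As a quick sanity check, here is a tiny Python sketch of that arithmetic (mine, not part of the original text):

    original = 100.0          # seconds
    shaved = 60.0             # seconds removed
    remaining = original - shaved

    x = shaved / original     # fraction of time removed: 0.6
    r = original / remaining  # observed speedup: 2.5

    print(x, r, 1.0 / (1.0 - x))  # 0.6 2.5 2.5 -- matches R = 1/(1-X)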
Here's how X and R are related:
 X      R
 0.0    1
 0.1    1.11..
 0.2    1.25
 0.3    1.43..
 0.4    1.66..
 0.5    2
 0.6    2.5
 0.7    3.33..
 0.8    5
 0.9    10
 0.99   100
 1.0    inf
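The table is just R = 1/(1-X) evaluated at each value. A minimal Python sketch that reproduces it:

    # Print X versus R = 1/(1-X).
    # X = 1.0 is omitted because it divides by zero, i.e. infinite speedup.
    for x in [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 0.99]:
        r = 1.0 / (1.0 - x)
        print(f"{x:4.2f}  {r:6.2f}")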
Also, these effects compound.
If you reduce time by 50%, the speed is multiplied by 2.
If you take the result of that, and reduce its time by 50%, the speed is multiplied again by 2.
Notice that the second 50% was only 25% of the original time.
The first time reduction made the second one bigger, as a fraction of the remaining time, by a factor of 2!
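Here is a small sketch of that compounding arithmetic (my own illustration, using a 100-second program as in the earlier example):

    # Two successive 50% reductions, starting from 100 seconds.
    t0 = 100.0
    t1 = t0 * (1 - 0.5)    # first reduction: 50 seconds remain
    t2 = t1 * (1 - 0.5)    # second reduction: 25 seconds remain

    print(t0 / t1)         # 2.0  -- first speedup
    print(t1 / t2)         # 2.0  -- second speedup
    print(t0 / t2)         # 4.0  -- compounded: 2 * 2
    print((t1 - t2) / t0)  # 0.25 -- the second cut is only 25% of the original time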
Here's an example where that speed-compounding magnification effect produced a speedup of 730 times.