A 2010 report to the U.S. President and Congress about past and future advances in information technology notes that:
"While improvements in hardware accounted for an approximate 1,000 fold increase in calculation speed over a 15-year time-span, improvements in algorithms accounted for an over 43,000 fold increase."
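If I am reading the numbers right, the two factors are meant to multiply, so the implied combined speedup is roughly 43 million. A quick back-of-the-envelope check (my own arithmetic, not taken from the report; the 82-year comparison is just my way of putting 43 million minutes into perspective):

```python
# Back-of-the-envelope check of how the two claimed factors would compose.
hardware_speedup = 1_000      # claimed hardware factor over ~15 years
algorithm_speedup = 43_000    # claimed algorithmic factor over the same period

total_speedup = hardware_speedup * algorithm_speedup
print(f"implied combined speedup: {total_speedup:,}x")   # 43,000,000x

# For scale: a computation that once took 82 years would shrink to about a
# minute under a 43-million-fold speedup (82 years is roughly 43 million minutes).
minutes_in_82_years = 82 * 365.25 * 24 * 60
print(f"82 years is roughly {minutes_in_82_years:,.0f} minutes")
```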
I am intrigued by how they came up with this figure. The report attributes it to research by Professor Martin Grötschel of the Konrad-Zuse-Zentrum für Informationstechnik Berlin, but I couldn't find the specific source.
This report was also cited by the New York Times here, where the headline reads "Software Progress Beats Moore's Law" and the author claims that "[the findings] sharply contradict the conventional wisdom."
Either way, the report's conclusion seems to be that software improvements accounted for a larger share of the increase in calculation speed than hardware improvements did (apparently over the period from 1988 to 2003).
Is this true? How did they measure it?
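My naive guess at how such a split could be measured is to time the same benchmark problem with the old and the new algorithm on the same machine, attribute that ratio to the algorithms, and estimate the hardware factor separately from processor benchmarks. Purely my own sketch of that idea, with toy stand-in "solvers" (none of this is taken from the report):

```python
import time

def algorithmic_speedup(solve_old, solve_new, problem):
    """Time the same problem with an old and a new solver on the same hardware;
    the ratio then isolates the algorithmic contribution. (My own hypothetical
    helper; nothing like this is described in the report itself.)"""
    start = time.perf_counter()
    solve_old(problem)
    old_time = time.perf_counter() - start

    start = time.perf_counter()
    solve_new(problem)
    new_time = time.perf_counter() - start

    return old_time / new_time

# Toy stand-ins just to make the sketch runnable: a naive sum of 0..n-1 versus
# the closed-form formula plays the role of "1988 algorithm" vs. "2003 algorithm".
naive_sum = lambda n: sum(range(n))
formula_sum = lambda n: n * (n - 1) // 2

print(f"toy speedup from the better algorithm: "
      f"{algorithmic_speedup(naive_sum, formula_sum, 10_000_000):.1f}x")
```

I'd still like to know whether the report (or Grötschel's work) actually did something along these lines, and on which benchmark problem.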