
A 2010 report to the U.S. President and Congress about past and future advances in information technology notes that:

"While improvements in hardware accounted for an approximate 1,000 fold increase in calculation speed over a 15-year time-span, improvements in algorithms accounted for an over 43,000 fold increase."

I am intrigued by how they came up with this figure. It quotes research by Professor Martin Grötschel of Konrad-Zuse-Zentrum für Informationstechnik Berlin, but I couldn't find the specific source.

This report was also cited by the New York Times here, where the title reads "Software Progress Beats Moore’s Law" and the author claims that "[the findings] sharply contradict the conventional wisdom."

Either way, the conclusion in the report seems to be that software improvements account for a larger share of the increase in calculation speed than hardware improvements do (apparently over the period 1988–2003).

Is this true? How did they measure it?

  • I’m only skeptical of the claim that this contradicts conventional wisdom. In fact, it’s very much what I’d expect from experience. – Konrad Rudolph Apr 22 '12 at 17:38
  • @KonradRudolph I think the NYT meant amongst people who *use* software, not people who **make** software. I'm pretty sure most CS departments derive the benefit of algorithmic improvement versus hardware improvement the day after they show you how to nest for() loops, but those darn users have a history of complaining that version 3 is slower than version 2 despite having all these shiny new features :) – Tacroy Apr 23 '12 at 18:03
  • @Tacroy I don’t think so. Software users don’t know Moore’s law, don’t know what algorithms are and don’t know at what pace computers get faster. They have no “conventional wisdom” in a domain that they know nothing about. Of course we can debate this point endlessly, but I do believe the NYT was referring to expert opinion here, since this is usually how the phrase “conventional wisdom” is used. Otherwise they’d have said “commonly held belief” or something similar. – Konrad Rudolph Apr 23 '12 at 18:21
  • In the case of improved algorithmic complexity, you cannot really speak of an "*X*-fold increase". Say you develop an *O(log n)* algorithm for something that used to be done in *O(n)*; the speed increase in absolute numbers depends on the problem size (see the sketch below). – vartec Apr 27 '12 at 09:55
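
To make vartec's point concrete, here is a minimal sketch. It is purely illustrative: it assumes both algorithms do exactly one unit of work per basic operation, which real implementations won't match, but it shows how the *O(n)* → *O(log n)* speedup grows with the input size rather than being a fixed factor:

```python
import math

# Hypothetical speedup of an O(log n) algorithm over an O(n) one,
# assuming (for illustration) one unit of work per basic operation.
# The ratio n / log2(n) grows without bound as n grows.
for n in (10**3, 10**6, 10**9):
    speedup = n / math.log2(n)
    print(f"n = {n:>13,}: ~{speedup:,.0f}-fold speedup")
```

For n = 10^3 the ratio is only about 100-fold, but for n = 10^9 it is already about 33-million-fold, so a single "X-fold" figure is only meaningful for a fixed benchmark size.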

1 Answer


This is a misquote. It does not come from the report that you cite; it is a paraphrase without enough context.

The real quote, in context, is far less exciting (emphasis mine, link in question):

In the field of numerical algorithms, however, the improvement can be quantified. Here is just one example, provided by Professor Martin Grötschel of Konrad-Zuse-Zentrum für Informationstechnik Berlin.
Grötschel, an expert in optimization, observes that a benchmark production planning model solved using linear programming would have taken 82 years to solve in 1988, using the computers and the linear programming algorithms of the day. Fifteen years later – in 2003 – this same model could be solved in roughly 1 minute, an improvement by a factor of roughly 43 million. Of this, a factor of roughly 1,000 was due to increased processor speed, whereas a factor of roughly 43,000 was due to improvements in algorithms! Grötschel also cites an algorithmic improvement of roughly 30,000 for mixed integer programming between 1991 and 2008.
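
The arithmetic in the quote is easy to verify from the figures it gives: 82 years is roughly 43 million minutes, so going from 82 years to about 1 minute is a ~43-million-fold improvement, and the hardware and algorithm factors multiply to that total. A quick sanity check, using only the numbers in the quote:

```python
# 82 years expressed in minutes should be close to the overall
# improvement factor of ~43 million quoted in the report.
minutes_in_82_years = 82 * 365.25 * 24 * 60
print(f"82 years ≈ {minutes_in_82_years:,.0f} minutes")  # ≈ 43,128,720

# The hardware factor times the algorithm factor recovers the total:
print(f"1,000 × 43,000 = {1_000 * 43_000:,}")  # 43,000,000
```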

The idea that one limited subclass of algorithms (i.e. those used for a certain form of mathematical modelling) may have improved does not 'sharply contradict[...] the conventional wisdom'.

Oddthinking
  • Thanks. I'm still wondering how they calculated those numbers. In particular, I am wondering which algorithms they refer to that are assumed to have come out after 1988. I also wonder to what extent the claim that `Software Progress Beats Moore's Law` (from the summary in the original report) can be generalized from this particular example. – Amelio Vazquez-Reina Apr 24 '12 at 13:17