
Say you have a very deterministic algorithm that produces a list, like inits in Data.List. Is there any way that a Haskell compiler can optimally perform an "indexing" operation on this algorithm without actually generating all the intermediate results?

For example, inits [1..] !! 10000 is pretty slow. Could a compiler somehow deduce what inits would produce on the 10000th element without any recursion, etc? Of course, this same idea could be generalized beyond lists.

Edit: While inits [1..] !! 10000 is a constant expression, I am wondering about any "index-like" operation on some algorithm. For example, could \i -> inits [1..] !! i be optimized such that no [or minimal] recursion is performed to reach the result for any i?
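For reference, the expression being asked about can be run directly; it terminates, since only the 10000th prefix is demanded, but it is slow because every earlier prefix is built along the way:

```haskell
import Data.List (inits)

-- The question's example: forcing one late element of inits on an
-- infinite list. This terminates, but constructing element 10000
-- requires working through all the earlier elements first.
main :: IO ()
main = print (length (inits [1 ..] !! 10000))  -- prints 10000
```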

Elliot Cameron
    Well, technically it could evaluate that expression at compile time. In some specific cases, a "sufficiently smart compiler" could perhaps find a closed form expression. But in general, I think this is not possible let alone feasible. I'll leave the reduction to the halting problem to someone else :-) –  Jan 13 '14 at 22:58
  • possible duplicate of [Why are (constant) expressions not evaluated at compile time in Haskell?](http://stackoverflow.com/questions/19259114/why-are-constant-expressions-not-evaluated-at-compile-time-in-haskell) – phs Jan 13 '14 at 22:59
  • [This](http://stackoverflow.com/a/19260999/580412) may be enlightening. – phs Jan 13 '14 at 22:59
    @phs Only if it's intended that the example is *completely* constant. I read the question to also ask about related expressions like `\i -> inits [1..] !! i`. –  Jan 13 '14 at 23:00
  • deldan: Yes, that was my intent. – Elliot Cameron Jan 13 '14 at 23:00
  • @3noch Well that's unfortunate, which one is your intent? Scratch that: You just tell us if (and if not, why) that question is a duplicate of yours. –  Jan 13 '14 at 23:02
  • Sorry for being confusing. I edited the question. – Elliot Cameron Jan 13 '14 at 23:07

1 Answer


Yes and no. If you look at the definition for Data.List.inits:

inits                   :: [a] -> [[a]]
inits xs                =  [] : case xs of
                                  []      -> []
                                  x : xs' -> map (x :) (inits xs')

you'll see that it's defined recursively. That means each element of the resulting list is built from the previous one, so to get the nth element you have to build all n-1 elements before it.
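To make the cost concrete, here is the same definition reproduced locally (under the hypothetical name initsRec, just for this sketch) with the recursion unrolled by hand for a small index:

```haskell
-- The recursive definition from Data.List, reproduced locally
-- under the hypothetical name initsRec for illustration.
initsRec :: [a] -> [[a]]
initsRec xs = [] : case xs of
                     []      -> []
                     x : xs' -> map (x :) (initsRec xs')

-- Unrolling the recursion by hand for index 2:
--   initsRec [1,2,3] !! 2
--     = map (1 :) (initsRec [2,3]) !! 1
--     = 1 : (map (2 :) (initsRec [3]) !! 0)
--     = 1 : 2 : []
-- Each level of the recursion contributes one `map (x :)` layer,
-- so forcing element n walks through all n earlier levels.

main :: IO ()
main = print (initsRec [1, 2, 3] !! 2)  -- prints [1,2]
```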

Now you could define a new function

inits' xs = [] : [take n xs | (n, _) <- zip [1..] xs]

which has the same behavior. If you try to take inits' [1..] !! 10000, it finishes very quickly because the successive elements of the list do not depend on the previous ones. Of course, if you were actually trying to generate a list of inits instead of just a single element, this would be much slower.

The compiler would have to know a lot of information to be able to optimize away the recursion in a function like inits. That said, if a function really is "very deterministic", it should be trivial to rewrite it in a non-recursive way.
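As a concrete instance of such a rewrite for the single-element case: the nth element of inits xs is exactly take n xs, so an index-like operation can bypass the recursion entirely. (initsAt is a hypothetical name invented for this sketch, not a library function.)

```haskell
import Data.List (inits)

-- Hypothetical helper: the nth element of inits xs is take n xs,
-- so indexing needs no recursion over intermediate prefixes.
initsAt :: Int -> [a] -> [a]
initsAt n xs = take n xs

main :: IO ()
main = print (initsAt 3 "abcde" == inits "abcde" !! 3)  -- prints True
```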

stonegrizzly
    Excellent answer! So the compiler would have to know how to convert between certain types of recursive algorithms and their non-recursive counter-part, as you have done. I wonder if such an optimization is possible... – Elliot Cameron Jan 14 '14 at 01:50