A space leak is usually defined as a program execution that consumes more space than necessary. Is such a definition compatible with algorithms that require lazy evaluation for their amortized time efficiency?

I wonder if the two tend to exhibit different patterns in their thunk structures. For example, a trivial space leak might look like this:

1 + (2 + 3 + ...)
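
For concreteness, a minimal Haskell sketch of how such a right-nested chain can arise (it mirrors the `foldr` example from the comments below; the names here are my own):

```haskell
-- A right-nested chain of pending additions, 1 + (2 + (3 + ... + (n + 0))),
-- as produced by a right fold. Every one of the n pending additions is
-- live at once before the innermost one can complete, so peak memory
-- use grows linearly with n.
leakyChain :: Int -> Int
leakyChain n = foldr (+) 0 [1 .. n]

main :: IO ()
main = print (leakyChain 1000000)
```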

Would it be normal for a lazy algorithm like tree search to produce thunk structures of similar size?

I suspect that proper lazy evaluation patterns tend to look different, which would make actual leaks easier to spot. For example, the structure could look like this:

[...] : a : b : c

where the `[...]` part is a prefix that nothing references any more and that could already have been garbage collected. In that case the lazy algorithm may well run in O(1) space and there cannot be a space leak, making the distinction very clear.
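
A minimal sketch of that streaming shape, assuming the consumer is a strict left fold (the choice of `foldl'` is mine, not something stated above):

```haskell
import Data.List (foldl')

-- The list is produced lazily and consumed one element at a time. Once
-- an element has been folded into the strict accumulator, nothing
-- references the prefix any more, so the collector can reclaim it and
-- residency stays O(1) no matter how long the list is.
streamingSum :: Int -> Int
streamingSum n = foldl' (+) 0 [1 .. n]

main :: IO ()
main = print (streamingSum 1000000)
```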

I wonder whether that is common, or whether there is a wider spectrum of trade-offs between space and time efficiency in lazy languages.

EDIT: A possible counterexample: persistent data structures. Are they easy to distinguish from space leaks?
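
To make the edit concrete, a hedged sketch of why persistence muddies the picture: the older version's data is retained on purpose, yet a heap profile cannot easily tell that retention apart from a leak. (The example is my own, not from any source above.)

```haskell
-- Two versions of a persistent list. v2 shares most of v1's spine, and
-- v1's first ten cells stay live only because v1 itself is still
-- referenced. That retention is intentional (it is what persistence
-- means), but on a heap profile it looks just like memory that merely
-- failed to be collected.
v1, v2 :: [Int]
v1 = [1 .. 1000]
v2 = 0 : drop 10 v1  -- a "new version" reusing the old spine

main :: IO ()
main = print (sum v1 + sum v2)
```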

sevo
  • It looks to me like your space leak and `O(1)` examples have the same structure (a right fold), just with a different operator. I may be wrong, but I can't see any way for a compiler to tell the difference in a majority of cases. – DarthFennec Apr 12 '19 at 21:31
  • Could you make your examples a bit more specific? – David Young Apr 12 '19 at 22:49
  • 1
    Space leaks are often an emergent property of the program. A somewhat trivial example is the familiar `foldr (+) 0 [1..1000000 :: Int]`, which builds a million-item-long chain of nested thunks, before that gives you an answer. That it does that isn't the fault of the code for `foldr`, nor the code for `+`; both are perfectly reasonable considered independently (we could not, for example, use `foldl'` instead of `foldr` everywhere, we'd just get different leaks). The space leak arises from their use together. Other examples can involve far more pieces. – Ben Apr 16 '19 at 08:21
  • So I don't think the compiler can just look at patterns in the "local" code as it compiles and spot a space leak; it would need a more complicated whole-program analysis. But I'm not sure I follow what you mean by "space leaks consuming memory differently than valid lazy algorithms", or why persistent data structures would be a counterexample, so I don't know whether my reasoning actually addresses your question. – Ben Apr 16 '19 at 08:28

0 Answers