
In Practical Clojure, the authors mention that once a lazy seq value is calculated, it is cached.

If we get a very large number of values from a lazy-seq, might we see an out-of-memory error? Or is there a mechanism to prevent that (e.g. older cached values are removed to make room for new ones)?

Tianxiang Xiong

1 Answer


Realised elements in a lazy sequence can be garbage collected like any other object in Clojure, with one important caveat: you must not hold a reference to the head of the sequence. This is known as "holding the head".

In concrete terms, using doall to realise the whole sequence, or storing a reference to the lazy sequence (say in an atom or with def) while traversing it with map, are both ways of holding the head.
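
As a minimal sketch (the name nums and the index are only illustrative), the difference looks like this:

    ;; Holding the head: the var `nums` keeps the first cell of the seq reachable,
    ;; so every element realised while traversing it stays in memory. With a large
    ;; enough index this is likely to exhaust a default-sized heap.
    (def nums (map inc (range)))       ; infinite lazy seq, head held by the var
    ;; (nth nums 100000000)            ; realises (and retains) ~1e8 cached elements

    ;; Not holding the head: the seq is created and consumed in a single expression,
    ;; so cells that have already been traversed become unreachable and can be
    ;; garbage collected as you go. Memory use stays roughly constant.
    (nth (map inc (range)) 100000000)  ; => 100000001

The cached values themselves are what pile up in the first case: nothing ever expires them, so the only way they can be reclaimed is for the head to become unreachable.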

Daniel Compton
  • Care to clarify how that reconciles with, or relates to, the caching mentioned in the question? – matanster Mar 10 '18 at 11:39
  • I think memoising may be a better term to use than caching. As far as I know, realised seq results are never 'expired' as a cache's results may be. If you don't hold onto the head, the values in the head of the seq can be GC'd because nothing is referencing them. – Daniel Compton Mar 11 '18 at 19:32
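
A small sketch of that memoisation behaviour (the println is only there to show when elements are actually computed):

    (def s (map (fn [x] (println "computing" x) (* x x)) (range 3)))

    (first s) ; prints "computing 0" (and 1 and 2, because range is chunked), returns 0
    (first s) ; prints nothing: the realised values are reused, never recomputed

Note that s holds the head here, which is exactly what keeps the realised values alive for reuse.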