According to a comment at http://www.reddit.com/r/programming/comments/gwqa2/the_real_point_of_laziness/c1rslxk:

Some algorithms don't terminate in an eager language, that do in a lazy one, and (a mild shocker for me to find) vice-versa.
The former is of course well known, but the latter, if true, strikes me as considerably more than a mild shocker.
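To make the well-known direction concrete, here is a small sketch using Python generators as a stand-in for lazy evaluation (the names and the `itertools.count` pipeline are my own illustration, not from the linked comment):

```python
import itertools

# Lazy pipeline over an infinite sequence: nothing is computed until
# an element is demanded, so taking the head terminates immediately.
doubles = (x * 2 for x in itertools.count(1))  # generator, not yet evaluated
first = next(doubles)                          # forces exactly one element
print(first)                                   # prints 2

# The eager analogue would diverge, because it tries to materialize
# the entire infinite sequence before extracting the head:
#   first = [x * 2 for x in itertools.count(1)][0]   # never terminates
```

So under lazy (call-by-need) evaluation the program terminates, while the naive eager version does not. The question below asks about the opposite direction.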
Does anyone know of an algorithm that terminates under eager evaluation but not under lazy evaluation?