I think that Haskell is a beautiful language, and judging by the benchmarks, its implementations can generate fast code.
However, I am wondering whether it is appropriate for long-running applications, or whether chasing down all the potential laziness-induced space leaks, which one could safely ignore in a short-lived application, would prove frustrating.
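For context, this is the kind of leak I mean. The following is a minimal, purely illustrative sketch (the names are my own): a lazy left fold builds a chain of unevaluated thunks whose size grows with the input, and the same pattern applied to state held across a long-running loop is what worries me.

```haskell
import Data.List (foldl')

-- Leaky: foldl accumulates a chain of unevaluated (+) thunks, so memory
-- grows with the length of the input. (With -O2, GHC's strictness
-- analysis may rescue this particular example, but the pattern is the point.)
leakySum :: [Int] -> Int
leakySum = foldl (+) 0

-- Fixed: foldl' forces the accumulator at each step, so the fold runs
-- in constant space.
strictSum :: [Int] -> Int
strictSum = foldl' (+) 0

main :: IO ()
main = print (strictSum [1 .. 10000000])
```

The fix here is a one-liner, but only once you know where to look.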
This Reddit comment echoes my concerns:
> As soon as you have more than one function calling itself recursively, the heap profile ceases to give you any help pinpointing where the leak is occurring.
(That whole discussion seems insightful and frank.)
I am personally interested in high-performance computing, but I guess servers and HPC have this requirement in common.
If Haskell is appropriate for such applications, are there any examples proving this point, that is, applications that:
- need to run for days or weeks, and therefore require the elimination of all relevant leaks (time the program spends sleeping or waiting for some underlying C library to return obviously doesn't count)
- are non-trivial (If the application is simple, the developer could just guess the source of the leak and try various fixes. However, I don't believe this approach scales well. The helpfulness of the heap profile in pinpointing the source of the leak(s) when multiple [mutually] recursive functions are involved seems to be of particular concern, as per the Reddit discussion above; see the sketch after this list)
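To make that concern concrete, here is a hypothetical sketch (the function names are mine, not from the discussion) of the shape of program the comment describes:

```haskell
{-# LANGUAGE BangPatterns #-}

-- Two mutually recursive functions sharing a lazy accumulator.
-- The unevaluated (+) thunks are allocated alternately by both
-- functions, so a cost-centre heap profile (+RTS -hc) splits the
-- retained memory between them instead of pointing at a single fix
-- site. (GHC's strictness analysis may remove this particular leak
-- at -O2; real leaks are usually less obvious.)
walkA, walkB :: Integer -> [Integer] -> Integer
walkA acc []       = acc
walkA acc (x : xs) = walkB (acc + x) xs
walkB acc []       = acc
walkB acc (x : xs) = walkA (acc + x) xs

-- The usual fix: force the accumulator in both functions.
walkA', walkB' :: Integer -> [Integer] -> Integer
walkA' !acc []       = acc
walkA' !acc (x : xs) = walkB' (acc + x) xs
walkB' !acc []       = acc
walkB' !acc (x : xs) = walkA' (acc + x) xs

main :: IO ()
main = print (walkA 0 [1 .. 5000000])
```

With only two functions the culprit is still guessable; my worry is the same situation in a codebase with dozens of them.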
If Haskell is not appropriate for such applications, then why?
Update: The Yesod web framework for Haskell, which was put forward as an example, may have memory issues. I wonder if anyone has tested its memory usage after serving requests continuously for days.