Such structures are needed for real-time applications, such as user interfaces. (Users don't care whether clicking a button takes 0.1s or 0.2s, but they do care if the 100th click forces an outstanding lazy computation and takes 10s.)
I was reading Okasaki's thesis, Purely Functional Data Structures, and he describes an interesting general method (scheduling) for converting lazy data structures with amortized bounds into structures with the same bounds in the worst case for every operation. The idea is to distribute the computation so that each update forces some portion of the unevaluated thunks.
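For a concrete instance of the scheduling idea, here is a minimal sketch of Okasaki's real-time queue in Haskell (the names `snoc`, `uncons`, `exec` are my own choices, not from any particular library):

```haskell
-- Okasaki's real-time queue: every operation is O(1) worst-case.
-- Invariant: the schedule s is a suffix of the front list f with
-- |s| = |f| - |r|, so the schedule empties exactly when |r| = |f| + 1.
data Queue a = Queue [a] [a] [a]  -- front, rear (reversed), schedule

-- rotate f r a == f ++ reverse r ++ a, computed one step at a time
-- (only ever called when |r| = |f| + 1)
rotate :: [a] -> [a] -> [a] -> [a]
rotate []     (y:_)  a = y : a
rotate (x:xs) (y:ys) a = x : rotate xs ys (y : a)
rotate _      _      a = a  -- unreachable under the invariant

-- force one step of the schedule; when it runs out, start a rotation
exec :: [a] -> [a] -> [a] -> Queue a
exec f r (_:s) = Queue f r s
exec f r []    = let f' = rotate f r [] in Queue f' [] f'

empty :: Queue a
empty = Queue [] [] []

snoc :: Queue a -> a -> Queue a
snoc (Queue f r s) x = exec f (x : r) s

uncons :: Queue a -> Maybe (a, Queue a)
uncons (Queue []      _ _) = Nothing
uncons (Queue (x:f') r s) = Just (x, exec f' r s)
```

Every `snoc` and `uncons` forces exactly one step of any pending `rotate` via the schedule, so no single operation ever has to pay for a whole reversal at once.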
I wonder, is there any such implementation of standard collections (Map, Set, etc.) in Haskell?
The containers package says
The declared cost of each operation is either worst-case or amortized, but remains valid even if structures are shared.
so there is no guarantee of worst-case bounds for any single operation. There are strict variants like Data.Map.Strict, but they are strict only in their keys and values:
Key and value arguments are evaluated to WHNF; Keys and values are evaluated to WHNF before they are stored in the map.
and nothing is said about the (possible) strictness of the map's structure itself.
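To make the worry concrete: with the lazy Data.Map.Lazy from containers, a long run of cheap updates can leave a chain of thunks in a single value, so whichever operation finally forces that value pays for all of them at once. This sketch is about value laziness (the spine is a separate question):

```haskell
import qualified Data.Map.Lazy as M

-- Each insertWith call is cheap, but the value stored under () grows
-- into a chain of 100000 suspended (+) applications; the first lookup
-- that forces the result pays the whole accumulated cost at once.
lazyCount :: M.Map () Int
lazyCount =
  foldl (\acc _ -> M.insertWith (+) () 1 acc) M.empty [1 .. 100000 :: Int]
```

This is exactly the amortized-vs-worst-case gap from the button example: the 100000 cheap clicks are fine, and then one unlucky operation takes all the deferred time.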