I'm doing this:
```haskell
import qualified Data.ByteString.Lazy.Char8 as BS

main :: IO ()
main = do
    let filename = "sample.txt"
    text <- BS.readFile filename
    let res = BS.take 1000 text
    print res
```
When I run this with profiling enabled, it reports:
```
  162,048 bytes allocated in the heap
    2,472 bytes copied during GC
   59,688 bytes maximum residency (1 sample(s))
   22,232 bytes maximum slop
      156 MB total memory in use (0 MB lost due to fragmentation)
```
The file I'm reading is about 50 KB. Why is the maximum residency about 60 KB? I have tried `String` and lazy `Text` as well, and the picture is the same. It seems that Haskell either reads the whole file into memory or allocates as much memory as the file is long. How can I prevent this? I would like to read only the first N bytes without wasting that much memory.
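For example, I assume something like the following sketch, using `BS.hGet` from the strict `Data.ByteString` API (which reads at most N bytes from a handle), should avoid pulling in the whole file. The call that creates a dummy `sample.txt` is only there to make the example self-contained. Is this the right approach?

```haskell
import qualified Data.ByteString as BS
import qualified Data.ByteString.Char8 as C8
import System.IO

main :: IO ()
main = do
    -- create a ~50 KB sample file (stand-in for the real sample.txt)
    BS.writeFile "sample.txt" (C8.replicate 50000 'x')
    -- hGet reads at most 1000 bytes from the handle, so only
    -- that much of the file should ever be brought into memory
    h <- openFile "sample.txt" ReadMode
    chunk <- BS.hGet h 1000
    hClose h
    print (BS.length chunk)  -- prints 1000
```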