I get this error from time to time when I run ghcjs on a large-ish code base (~10k LOC, lots of template haskell):

fd:40: hPutBuf: illegal operation (handle is closed)

my best bet is that this is an out-of-memory error. when i re-run the build it usually works the second or third time (probably because it can build on the partial result from the previous runs?).

now i am wondering if there is a way to give node more memory than the 1GB (i think?) that it gets by default. there is --max_old_space_size, which i found when searching here for "increase node memory", but i haven't found a way to pass that argument to node as called from ghcjs.

thanks!

EDIT: possibly related: https://github.com/ghcjs/ghcjs/issues/601, https://github.com/ghcjs/ghcjs/issues/588

user2645074
    That error message doesn't *sound* like it is out of memory. It sounds like it is trying to write to a file that has already been closed: perhaps a race condition in ghcjs or in your usage of it? – amalloy Aug 22 '17 at 07:04
  • ok, you are right, that interpretation also makes sense. what would be a good way to find more evidence here? – user2645074 Aug 22 '17 at 07:24
  • On that topic I'm afraid I have no clever ideas. You could do what you're doing now: try giving it more memory and see if that helps. – amalloy Aug 22 '17 at 07:38
  • yes, but that was my original question. (-: – user2645074 Aug 22 '17 at 09:07
  • i.e., how do i call ghcjs so that it gives more memory to node? – user2645074 Aug 22 '17 at 09:08
  • GHCJS doesn't seem to permit passing options to node (see e.g. [here](https://github.com/ghcjs/ghcjs/blob/ee742e015edb8ba8bc443d7869361f312ddb4cd7/src/Gen2/TH.hs#L695-L698), where it starts node to compile TH). Based on my cursory glance at the source code, you have two options: make a node alias which calls the node executable with the appropriate 'extra memory' flag; or modify GHCJS itself to accept an additional flag which is then passed to node at that point (and maybe others) - this could be far from trivial! – user2407038 Aug 22 '17 at 23:58
  • that's useful, thanks! writing a wrapper script was actually not too hard (since i couldn't get the argv escaping done in sh or bash, i just did it in python). the problem still happens with 8GB, so the theory that there is some open-files ulimit, or something similar but more node-specific, is still in the race. but i couldn't find a flag for that even in the node options. shouldn't handle limits be far higher by default than what's needed for a couple of template haskell threads? – user2645074 Aug 23 '17 at 22:31
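A minimal sketch of the wrapper approach the comments describe: a script named `node`, placed on `PATH` ahead of the real binary, that prepends the V8 heap flag and forwards the rest of the arguments untouched. Using `os.execv` avoids the sh/bash quoting problems mentioned above, since argv is passed as a list rather than re-parsed by a shell. The real node path and the 8 GB figure are assumptions; adjust for your system.

```python
#!/usr/bin/env python
# Hypothetical "node" wrapper for ghcjs: install this script as `node`
# somewhere earlier on PATH than the real node executable.
import os
import sys

REAL_NODE = "/usr/bin/node"  # assumed location of the actual node binary

def build_argv(argv):
    # Prepend the V8 heap flag, then forward every argument ghcjs passed
    # verbatim (argv[0] is replaced by the real binary's path). No shell
    # is involved, so no quoting or escaping is needed.
    return [REAL_NODE, "--max_old_space_size=8192"] + argv[1:]

if __name__ == "__main__" and os.path.basename(sys.argv[0]) == "node":
    # Only exec when actually invoked under the name "node" (as ghcjs
    # would); os.execv replaces this process with node, preserving argv.
    os.execv(REAL_NODE, build_argv(sys.argv))
```

Note that this only affects the node processes ghcjs spawns for Template Haskell; it does not change ghcjs's own memory limits.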

0 Answers