
jhat is a great tool for analyzing Java heap dumps, but for large heaps it's easy to waste a lot of time. Give jhat too small a runtime heap, and it may take 15 minutes just to fail with an out-of-memory error.

What I'd like to know is: is there a rule of thumb for how much -Xmx heap I should give jhat based on the size of the heap-dump file? Only considering binary heap dumps for now.

Some very limited experimentation indicates that it's at least 3-4 times the size of the heap dump. I was able to analyze a three-and-change-gigabyte heap file with -J-mx12G.

Does anyone else have more conclusive experimental data, or an understanding of how jhat represents heap objects at runtime?

data points:

  • this thread indicates a 5x overhead, but my experimentation on late-model jhats (1.6.0_26) indicates it's not quite that bad
  • this thread indicates a ~10x overhead
  • a colleague backs up the 10x theory: a 2.5 GB heap file fails with -J-mx23G
  • yet another colleague got a 6.7 GB dump to work with a 30 GB heap, for a 4.4x overhead.
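Taking the worst ratio reported above (~5x for newer jhats, up to ~10x in the older threads), a rough sizing sketch looks like this. The multiplier and the dump size here are assumptions for illustration, not a documented ratio:

```shell
#!/bin/sh
# Back-of-envelope jhat heap sizing, assuming a ~5x overhead
# (the ratio is an estimate from the data points above, not documented).
dump_mb=3500                      # hypothetical 3.5 GB binary heap dump
multiplier=5
heap_mb=$(( dump_mb * multiplier ))

# For a 3500 MB dump this suggests a 17500 MB (~17 GB) jhat heap:
echo "jhat -J-mx${heap_mb}m java_pid1234.hprof"
```

If that still dies with an OutOfMemoryError, bumping the multiplier toward the 10x figure from the older reports would be the next step.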
  • Have you tried the Eclipse Memory Analyzer as an alternative to jhat? – Joel Oct 05 '11 at 21:58
    The Eclipse Memory Analyzer is good. Another alternative for opening big heap dumps is [SAP Memory Analyzer](http://www.sdn.sap.com/irj/scn/downloads?rid=/library/uuid/a0f47c83-5ef6-2910-2c89-b75d296edef9) or [YourKit](http://www.yourkit.com/). The latter is not free. – DarkByte Oct 05 '11 at 22:22
  • I've used the Memory Analyzer, it's quite good. – Drizzt321 Oct 05 '11 at 22:33
  • Ok, ok, I will definitely check out Eclipse's MAT. I am still interested in getting a handle on jhat though. The ability to drop a dump on any machine, fire up jhat and send around a link is very helpful. – Adam Lehenbauer Oct 06 '11 at 15:46
  • Aren't Eclipse Memory Analyzer and SAP's project the same thing? – Big Rich Oct 30 '12 at 14:43

0 Answers