
Our server is running into performance issues, so we need to analyze it. One option is to take a very large heap dump, roughly 100 GB, and then analyze it with a profiler such as JProfiler or Eclipse MAT.
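
For reference, a dump of that size would typically be captured with jmap; the PID and output path below are placeholders:

```
# Dump live objects from the running JVM in binary HPROF format.
jmap -dump:live,format=b,file=/data/dumps/heap.hprof <pid>
```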

The problem is that these analyzers themselves consume a very large amount of memory while processing a heap dump.

Is it possible to use JProfiler or Eclipse MAT to analyze such a big heap dump? What kind of server is needed to complete this process?

  • Did you try to open the snapshot? How much memory do you have on the machine where you open the snapshot? You don't need 100G memory to open a 100G snapshot, but if you only have a small fraction of that memory, then there will be a lot of swapping which increases the loading time immensely. – Ingo Kegel Jun 15 '17 at 21:36
  • @Ingo Kegel My server has 16G of memory. I'm just wondering what the minimum memory requirement is for analyzing such a huge dump. Thanks! – Jack Jun 16 '17 at 17:58
  • Unfortunately it's not possible to say that in general. Some of the indexes have to be built in memory and their size can vary by orders of magnitude depending on the prevailing reference structures and object layouts. – Ingo Kegel Jun 17 '17 at 07:30
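
Both tools let you raise the analyzer's own heap limit, which is usually the first thing to try before moving to a bigger machine. A minimal sketch for MAT, assuming a default installation: everything after -vmargs in MemoryAnalyzer.ini is passed to MAT's JVM, and the 64g figure below is only an example, not a recommendation.

```
-vmargs
-Xmx64g
```

For JProfiler, the equivalent is commonly an -Xmx entry in bin/jprofiler.vmoptions inside the installation directory.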

1 Answer


You can run the analysis on the same server, either by forwarding X11 over SSH so you can run JProfiler or Eclipse MAT there, or by using a headless MAT instance to generate HTML reports; a sketch of both approaches follows.
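
The hostnames and paths below are placeholders; ParseHeapDump.sh and the report IDs are MAT's stock ones:

```
# Option 1: forward X11 over SSH and start the GUI on the server itself.
ssh -X user@server
/opt/jprofiler/bin/jprofiler &      # or launch MemoryAnalyzer the same way

# Option 2: headless MAT run that parses the dump and writes HTML reports.
cd /opt/mat
./ParseHeapDump.sh /data/dumps/heap.hprof \
    org.eclipse.mat.api:suspects \
    org.eclipse.mat.api:overview \
    org.eclipse.mat.api:top_components
# The zipped HTML reports are written next to the .hprof file.
```

The headless run also leaves MAT's index files next to the dump, so they can later be copied to a workstation and opened interactively without re-parsing the whole dump.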
