I have a JavaFX application that minimizes to the system tray when the X button is pressed. I have been monitoring the application's memory trends via VisualVM.
The weird part is, when the application is open or minimized to the taskbar, memory is always GCed back to the initial amount used. However, when it is minimized to the tray (`stage.hide()`, `systemTray.show()`), memory still gets GCed, but in an upward trend (a leak).
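For context, the hide-to-tray wiring looks roughly like the sketch below. This is a simplified, self-contained reconstruction: the real application uses a JavaFX `Stage`, which is represented here by the `onShow` callback (standing in for `Platform.runLater(stage::show)`), and the class and method names are illustrative, not my actual code.

```java
import java.awt.AWTException;
import java.awt.GraphicsEnvironment;
import java.awt.Image;
import java.awt.SystemTray;
import java.awt.TrayIcon;
import java.awt.event.MouseAdapter;
import java.awt.event.MouseEvent;
import java.awt.image.BufferedImage;

public class TraySupport {

    /**
     * Installs a tray icon and returns it, or null when the platform has
     * no tray support. In the real app, onShow runs
     * Platform.runLater(stage::show) to restore the hidden stage.
     */
    public static TrayIcon install(Runnable onShow) throws AWTException {
        if (GraphicsEnvironment.isHeadless() || !SystemTray.isSupported()) {
            return null; // no tray available (e.g. headless environment)
        }
        // Placeholder image; the real app loads an icon resource.
        Image img = new BufferedImage(16, 16, BufferedImage.TYPE_INT_ARGB);
        TrayIcon icon = new TrayIcon(img, "My App");
        icon.setImageAutoSize(true);
        icon.addMouseListener(new MouseAdapter() {
            @Override
            public void mouseClicked(MouseEvent e) {
                if (e.getClickCount() == 2) {
                    onShow.run(); // restore the JavaFX stage
                }
            }
        });
        SystemTray.getSystemTray().add(icon);
        return icon;
    }
}
```

On the JavaFX side, the close-request handler consumes the event and calls `stage.hide()`, and the app uses the usual `Platform.setImplicitExit(false)` setup so the FX runtime keeps running while no window is showing.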
In VisualVM, the Old Gen space keeps growing, and once it hits the maximum after some time, the application becomes unresponsive and CPU usage spikes to 80%.
I notice that if I call `stage.show()` (by double-clicking the tray icon, etc.), GC clears everything back to normal. However, if the app is left hidden for prolonged periods, the Old Gen simply fails to be collected.
A heap dump shows `javafx.scene.Scene#7` and `javafx.scene.Node[]#2` as having the most retained space. Neither appears if the stage is not hidden. Under references, it shows `this[] -> dirtyNodes()`:
```
this - value: javafx.scene.Node[] #2
<- dirtyNodes - class: javafx.scene.Scene, value: javafx.scene.Node[] #2
<- value - class: javafx.scene.Node$ReadOnlyObjectWrapperManualFire, value: javafx.scene.Scene #7
```
What is causing this and how can I solve this?