A good way to compare two sampling sessions is to compare their hot method histograms. This can be done in VisualVM or with the following SJK commands.
sjk ssa --histo --by-term -f OldHibernateLibrary_30min.nps
Trc (%) Frm N Term (%) Frame
64450 53% 64450 64450 53% java.util.concurrent.LinkedBlockingQueue.poll(Unknown Source)
22503 18% 22503 22503 18% sun.nio.ch.SocketChannelImpl.read(Unknown Source)
8954 7% 8954 8954 7% sun.nio.ch.SelectorImpl.select(Unknown Source)
6943 5% 6943 6943 5% java.lang.ClassLoader.loadClass(Unknown Source)
3828 3% 3828 3828 3% java.lang.Thread.sleep(Native Method)
1918 1% 1918 1918 1% java.lang.Object.wait(Native Method)
1674 1% 1674 1674 1% sun.nio.ch.SocketChannelImpl.write(Unknown Source)
...
sjk ssa --histo --by-term -f LatestHibernateLibrary_30min.nps
Trc (%) Frm N Term (%) Frame
60427 44% 60427 60427 44% java.util.concurrent.LinkedBlockingQueue.poll(Unknown Source)
28568 21% 28568 28568 21% java.lang.ClassLoader.loadClass(Unknown Source)
23072 17% 23072 23072 17% sun.nio.ch.SocketChannelImpl.read(Unknown Source)
6181 4% 6181 6181 4% sun.nio.ch.SelectorImpl.select(Unknown Source)
3030 2% 3030 3030 2% java.lang.Thread.sleep(Native Method)
1542 1% 1542 1542 1% sun.nio.ch.SocketChannelImpl.write(Unknown Source)
1451 1% 1451 1451 1% java.util.concurrent.LinkedBlockingQueue.take(Unknown Source)
...
Briefly browsing through the histograms, I can spot java.lang.ClassLoader.loadClass
growing from 5% to 21% (please mind that these are percentages of the total sample count; they do not translate directly into CPU usage, because a thread dump sampler counts blocked and waiting threads just as readily as running ones).
Assuming both snapshots were taken under the same load (which I cannot verify from the VisualVM snapshots), I can conclude that java.lang.ClassLoader.loadClass
is the culprit in the CPU usage degradation.
Filtering the histogram further (the -tf option keeps only stack traces that contain the given frame)
sjk ssa --histo --by-term -f OldHibernateLibrary_30min.nps -tf java.lang.ClassLoader.loadClass
sjk ssa --histo --by-term -f LatestHibernateLibrary_30min.nps -tf java.lang.ClassLoader.loadClass
I can see no difference between the old and new versions, i.e. the usage pattern stays the same across versions.
From the filtered histogram I can see that all paths to java.lang.ClassLoader.loadClass
go through org.springframework.orm.jpa.SharedEntityManagerCreator$SharedEntityManagerInvocationHandler.invoke,
so the problematic path is shown below
java.lang.ClassLoader.loadClass(Unknown Source)
org.springframework.util.ClassUtils.isVisible(Unknown Source)
org.springframework.util.ClassUtils.getAllInterfacesForClassAsSet(Unknown Source)
org.springframework.util.ClassUtils.getAllInterfacesForClassAsSet(Unknown Source)
org.springframework.orm.jpa.ExtendedEntityManagerCreator.createProxy(Unknown Source)
org.springframework.orm.jpa.ExtendedEntityManagerCreator.createProxy(Unknown Source)
org.springframework.orm.jpa.ExtendedEntityManagerCreator.createApplicationManagedEntityManager(Unknown Source)
org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.invokeProxyMethod(Unknown Source)
org.springframework.orm.jpa.AbstractEntityManagerFactoryBean$ManagedEntityManagerFactoryInvocationHandler.invoke(Unknown Source)
com.sun.proxy.$Proxy671.createEntityManager(Unknown Source)
com.spmsoftware.appframework.persistence.MultitenantEntityManagerFactory.createEntityManager(Unknown Source)
org.springframework.orm.jpa.SharedEntityManagerCreator$SharedEntityManagerInvocationHandler.invoke(Unknown Source)
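To make this path more concrete, here is a minimal Java sketch of the pattern the stack trace suggests (all class and method names below are hypothetical, not taken from Spring's or the application's actual code): every createEntityManager() call builds a fresh JDK dynamic proxy and re-checks interface visibility, and each visibility check ends up in ClassLoader.loadClass instead of reusing a cached answer.

import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class ProxyPerCallSketch {

    public interface EntityManagerLike {
        void doWork();
    }

    // Roughly what a visibility check has to do: a class is "visible" to the
    // loader if loading it by name yields the same Class object. Note that this
    // goes through ClassLoader.loadClass on every invocation.
    static boolean isVisible(Class<?> clazz, ClassLoader loader) {
        try {
            return clazz == loader.loadClass(clazz.getName());
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    // Hypothetical stand-in for the createProxy() step: re-scan the interfaces
    // and build a brand new proxy on every call instead of caching the result.
    static EntityManagerLike createEntityManager() {
        ClassLoader loader = Thread.currentThread().getContextClassLoader();
        isVisible(EntityManagerLike.class, loader);
        isVisible(AutoCloseable.class, loader);
        InvocationHandler handler = (proxy, method, args) -> null;
        return (EntityManagerLike) Proxy.newProxyInstance(
                loader, new Class<?>[] {EntityManagerLike.class}, handler);
    }

    public static void main(String[] args) {
        // Under request load this effectively runs once per transaction, which
        // is why loadClass shows up so prominently in the sampling histogram.
        for (int i = 0; i < 100_000; i++) {
            createEntityManager();
        }
    }
}

If the proxy class and the visibility result were cached per interface set, loadClass would drop out of this hot path; whether that is possible in the real framework code is beyond what the snapshots can tell us.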
Conclusions
VisualVM sampling shows increased time spent in the java.lang.ClassLoader.loadClass
method. Unfortunately, this is a limitation of thread dump based sampling: you cannot peek inside a native method.
java.lang.ClassLoader.loadClass
utilization is extremely high in both the old and the new versions, which makes me think about some misconfiguration in the framework.
java.lang.ClassLoader.loadClass
high times are likely to be the result of contention between threads rather than real CPU usage. Still, the relative change in this metric gives us a reason to consider it related to the root cause of the CPU usage growth.
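To illustrate the last point, here is a small, self-contained sketch (hypothetical class name) of why heavy concurrent traffic through ClassLoader.loadClass shows up as time spent in that frame without a matching CPU cost: the standard loadClass implementation takes a class loading lock before doing anything else, so under load threads queue behind one another and a thread dump based sampler keeps catching them inside that frame.

import java.util.concurrent.CountDownLatch;

public class LoadClassContentionSketch {

    public static void main(String[] args) throws InterruptedException {
        ClassLoader loader = LoadClassContentionSketch.class.getClassLoader();
        CountDownLatch start = new CountDownLatch(1);
        Runnable task = () -> {
            try {
                start.await();
                for (int i = 0; i < 200_000; i++) {
                    // The class is already loaded, so no real work is done,
                    // but loadClass still goes through its internal locking.
                    loader.loadClass("java.util.concurrent.LinkedBlockingQueue");
                }
            } catch (InterruptedException | ClassNotFoundException e) {
                throw new RuntimeException(e);
            }
        };
        Thread[] workers = new Thread[8];
        for (int i = 0; i < workers.length; i++) {
            workers[i] = new Thread(task, "loader-worker-" + i);
            workers[i].start();
        }
        start.countDown();     // release all workers at once
        for (Thread t : workers) {
            t.join();
        }
    }
}

Sampling this program would attribute a large share of samples to ClassLoader.loadClass even though most of that time is spent waiting on the lock rather than burning CPU, which is exactly the ambiguity described above.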