
I got a note from one of our application teams about some log entries after updating JBoss EAP from 7.0.11 to 7.1.0 in domain mode. They claim it is a sign that the application or JBoss is running out of memory. I have checked everything on the server itself and it seems fine: no errors, and other applications are working as designed. I also checked for custom logging and didn't see anything in the profile other than the default loggers. I didn't really see anything on the Internet or in the JBoss books about this entry, so I figured I would ask since this is still a pretty new release. Thanks in advance for the help!

  • RHEL 7.4
  • JBoss EAP 7.1.0
  • OpenJDK 1.8.0_151

This is an example of the log messages:

2018-01-23 08:19:36,094 INFO  [stdout] (default task-24) free memory: 8,121 allocated memory: 241,152   percentage left: 3.368  
2018-01-23 08:19:43,995 INFO  [stdout] (default task-26) free memory: 5,368 allocated memory: 241,152   percentage left: 2.226  
2018-01-23 08:19:48,347 INFO  [stdout] (default task-31) free memory: 2,668 allocated memory: 241,152   percentage left: 1.107  
2018-01-23 08:24:41,693 INFO  [stdout] (default task-2) free memory: 1,947  allocated memory: 241,152   percentage left: 0.808  
2018-01-23 08:29:27,092 INFO  [stdout] (default task-2) free memory: 7,279  allocated memory: 241,152   percentage left: 3.019  
2018-01-23 08:29:34,759 INFO  [stdout] (default task-8) free memory: 3,944  allocated memory: 241,152   percentage left: 1.636  
2018-01-23 08:36:28,319 INFO  [stdout] (default task-6) free memory: 3,585  allocated memory: 241,152   percentage left: 1.487  
2018-01-23 08:40:34,410 INFO  [stdout] (default task-27) free memory: 11,838    allocated memory: 241,152   percentage left: 4.909  
  • Yeah, Google shows nothing :-/ My only guess would be that it's showing a garbage collection trace? We're barely getting upgraded to JBoss EAP 7, so unfortunately I haven't run into that before - though I have seen Tomcat servers log that information. – JGlass Jan 24 '18 at 13:55
  • That looks like a "System.out.println" somewhere ... are you sure that's not from your application code? I'd be rather surprised if JBoss logged something like that via STDOUT... – tom Jan 24 '18 at 14:57
  • @tom, that was the argument I kept putting up to them as well. I truly think it's application-based; I asked them to review the code yesterday. Thanks for backing me up! :-) – JonRoyer2450 Jan 25 '18 at 17:13
  • When troubleshooting why an out-of-memory (OOM) condition has occurred, one must look at a few factors on the system: 1) a spike in memory usage from a load event (additional processes are needed for the increased load); 2) a spike in memory usage from additional services being added or migrated to the system (another app added or a new service started); 3) a spike in memory usage due to failed hardware; 4) a spike in memory usage due to undersized hardware resources for the running application(s); 5) a memory leak in a running application. – Anup Dey Jan 31 '18 at 08:52
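For reference, output like the lines above is typically produced by application code that reads the JVM's Runtime memory figures and prints them to standard out, which JBoss EAP then wraps in the "stdout" logger at INFO level. The following is only a hedged sketch of what such code might look like; the class name, the assumption that the figures are kilobytes, and the exact format are illustrative and not taken from the actual application.

// Hypothetical illustration only -- not from the application in question.
// Shows how Runtime stats printed via System.out could produce [stdout] entries
// like "free memory: 8,121 allocated memory: 241,152   percentage left: 3.368",
// assuming the figures are kilobytes.
public class MemoryReport {

    static void report() {
        Runtime rt = Runtime.getRuntime();
        long freeKb = rt.freeMemory() / 1024;       // free heap within the currently allocated pool
        long allocatedKb = rt.totalMemory() / 1024; // heap the JVM has currently allocated
        double percentLeft = 100.0 * freeKb / allocatedKb;
        // Anything written to System.out inside JBoss EAP is captured by the
        // "stdout" logger at INFO level, which matches the entries above.
        System.out.printf("free memory: %,d allocated memory: %,d   percentage left: %.3f%n",
                freeKb, allocatedKb, percentLeft);
    }

    public static void main(String[] args) {
        report();
    }
}

If entries like these need to be traced back to their source, searching the application code for calls to Runtime.getRuntime().freeMemory() / totalMemory(), or for the literal string "free memory:", is usually the quickest way to confirm where they come from.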

0 Answers