I've been analyzing a heap dump because of some recently reported slowness. It turns out there are a couple of 1.5GB byte arrays lodged in the heap, and I can't trace where they come from. Eclipse MAT doesn't show me the class that is holding such a huge chunk; it just says "<system class loader>". Perhaps I'm not looking in the right place.
What could be causing these two massive byte arrays to be retained? The app runs on WebSphere Application Server. Here's some JVM info about the environment I'm running the app in. Thanks.
- Compiled with: JDK 1.6
- Environment Java version: JRE 1.7.0 Linux amd64-64 build 20130421_145945 (pxa6470sr4fp1ifix-20130423_02(SR4 FP1+IV38579+IV38399+IV40208))
- Virtual machine version: VM build R26_Java726_SR4_FP1_2_20130421_2353_B145945
- JIT/AOT compiler version: r11.b03_20130131_32403ifx4
- Garbage collector version: GC - R26_Java726_SR4_FP1_2_20130421_2353_B145945_CMPRSS
- Java heap information:
- -Xmx (maximum Java heap size): 4096m
- -Xms (initial Java heap size): 1024m
- -Xscmx (Java class data sharing cache size): 90M
- -Xscmaxaot (maximum number of bytes in the cache usable for AOT data): 4M
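For reference, those settings would be passed on the server's command line as generic JVM arguments roughly like this (the -Xshareclasses flag is my assumption, since on IBM J9 the -Xscmx and -Xscmaxaot options only take effect when class data sharing is enabled):

java -Xms1024m -Xmx4096m -Xshareclasses -Xscmx90M -Xscmaxaot4M ...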
Edit:
These two byte array instances are not created directly in my code. I do, however, constantly use the methods below from a utility class to read and write bulk payroll files. I wonder whether the constant creation of temporary files and input streams has anything to do with it.
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.FileReader;
import java.io.IOException;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.multipart.MultipartFile;

public abstract class IOUtils {

    public static final Logger log = LoggerFactory.getLogger(IOUtils.class);

    private IOUtils() {}

    /** Returns the first line of the given file, or null if it cannot be read. */
    public static String readFirstLine(File f) {
        String line = null;
        BufferedReader reader = null;
        try {
            reader = new BufferedReader(new FileReader(f));
            line = reader.readLine();
        } catch (IOException e) {
            log.warn("error reading header", e);
        } finally {
            if (reader != null) {
                try {
                    reader.close();
                } catch (IOException e) {
                    log.warn("error closing file", e);
                }
            }
        }
        return line;
    }

    /** Copies the multipart upload into a new temporary file. */
    public static File multipartFileToFile(MultipartFile mFile) throws IOException {
        // File.createTempFile already creates the file on disk
        File convFile = File.createTempFile(mFile.getOriginalFilename(), ".tmp");
        FileOutputStream out = new FileOutputStream(convFile);
        try {
            org.apache.commons.io.IOUtils.copy(mFile.getInputStream(), out);
        } finally {
            out.close(); // close the stream so the file handle is not leaked
        }
        return convFile;
    }

    /** Moves the multipart upload into a new temporary file with the given suffix. */
    public static File multipartFileToFile(MultipartFile mFile, String suffix)
            throws IOException, IllegalStateException {
        File tmpFile = File.createTempFile(mFile.getOriginalFilename(), suffix);
        mFile.transferTo(tmpFile);
        return tmpFile;
    }

    /** Reads the whole file into memory; the returned array is as large as the file. */
    public static byte[] readBytes(File file) throws IOException {
        FileInputStream in = new FileInputStream(file);
        try {
            return org.apache.commons.io.IOUtils.toByteArray(in, file.length());
        } finally {
            in.close();
        }
    }

    /** Writes the given bytes to the file, replacing its contents. */
    public static void writeBytes(File file, byte[] data) throws IOException {
        FileOutputStream out = new FileOutputStream(file);
        try {
            org.apache.commons.io.IOUtils.write(data, out);
        } finally {
            out.close();
        }
    }
}
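For what it's worth, readBytes materializes the whole file in memory, so every call allocates a byte[] exactly as large as the file being read; if a huge payroll file ever went through it, a file-sized array like the ones in the dump is what you'd see. A streaming copy avoids that. Here is a minimal sketch using commons-io's stream copy (the copyStreaming method is illustrative, not something the class currently has):

public static void copyStreaming(File src, File dest) throws IOException {
    // Copies through commons-io's small internal buffer (4KB by default),
    // so the file's full contents never sit in memory at once.
    FileInputStream in = new FileInputStream(src);
    FileOutputStream out = new FileOutputStream(dest);
    try {
        org.apache.commons.io.IOUtils.copy(in, out);
    } finally {
        in.close();
        out.close();
    }
}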
Update: I googled the size of the byte array and found that the source of the issue was not my code but JGroups, which we were using for a shared cache. We ended up removing it, and the problem went away. Here's the JGroups issue in question: