I am running Java 1.6.0_25 (64-bit) on Windows 7 64-bit.
I am trying to run the JVM with maximum memory because my application is very memory intensive... but sadly, the allocated memory does not stay resident, and Windows generates a lot of page faults bringing the next set of virtual memory pages back in.
I am running java -Xms2G -Xmx3G Test
The following code is my attempt at bringing a whole file into memory up front, so that I do not hit page faults while reading it later:
File f = new File("veryLARGEfile.txt");
DataInputStream in = new DataInputStream(new FileInputStream(f));
int len = (int) f.length();
System.out.println("Bytes in file: " + len);
byte[] file = new byte[len];
in.readFully(file); // read(byte[]) may return before filling the array; readFully reads it all
in.close();
System.out.println("Bytes read: " + file.length);
Doing it this way, I can see in the Windows Task Manager that the process reaches about 2G of memory while it is reading the file... but once it is done reading, the memory falls back down again!!!!
This is a major problem... I need the whole byte array to stay in physical memory.
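One thing I have been experimenting with (just a rough sketch; the 4096-byte page size is an assumption about the OS, and the class/method names are mine) is touching one byte per page after the read, to force every page of the array back into RAM:

```java
public class Touch {
    // Touch one byte per assumed 4 KB page so each page is faulted into RAM.
    static long touchPages(byte[] data) {
        final int PAGE_SIZE = 4096; // assumption about the OS page size
        long sum = 0;
        for (int off = 0; off < data.length; off += PAGE_SIZE) {
            sum += data[off]; // the read forces the page to be resident
        }
        return sum; // returned so the JIT cannot eliminate the loop
    }
}
```

I am not sure whether this actually prevents the pages from being swapped out again later, though.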
Thank you, ey
I have modified the code to use plain array types, int[][] and float[][], to hold my data instead of keeping ArrayLists of an object containing ints and floats.
Doing this, I find that my Java memory no longer gets swapped out (so I guess heap memory is treated a bit differently from the stack here). [Oh, I did change all the code to be static as well - I know, very bad programming style.]
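To illustrate the kind of change I made (a minimal sketch; the field names ids and weights are hypothetical, not my real data):

```java
// Parallel primitive arrays instead of an ArrayList of small objects.
// Primitives are stored contiguously; an ArrayList holds references to
// separately allocated objects scattered across the heap.
public class Table {
    static int[] ids;
    static float[] weights;

    static void init(int n) {
        ids = new int[n];
        weights = new float[n];
        for (int i = 0; i < n; i++) {
            ids[i] = i;
            weights[i] = i * 0.5f;
        }
    }
}
```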
The issue that I am running into now is how to handle my HashMap... all my attempts at building a lookup table are failing with O(n^2) running time to build!!!
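For comparison, what I would expect to work is a single O(n) pass where each put is O(1) on average (a sketch with a hypothetical key array; pre-sizing the map avoids rehashing as it grows):

```java
import java.util.HashMap;

public class Lookup {
    // Build an int -> row-index table in one O(n) pass.
    // Capacity is pre-sized for the default 0.75 load factor.
    static HashMap<Integer, Integer> buildIndex(int[] keys) {
        HashMap<Integer, Integer> index =
            new HashMap<Integer, Integer>((int) (keys.length / 0.75f) + 1);
        for (int i = 0; i < keys.length; i++) {
            index.put(keys[i], i); // O(1) average per insertion
        }
        return index;
    }
}
```

So I suspect the quadratic time is coming from something in my own build loop rather than from HashMap itself.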