While reviewing a colleague's code, I found the following:
BufferedReader br = new BufferedReader(new FileReader(PATH + fileName));
//...
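For reference, the leak-free version I would have expected uses try-with-resources, which closes the reader automatically. A minimal sketch (the temp-file demo is mine, not the colleague's actual code):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadWithClose {
    // Reads every line of the file and concatenates them into one string,
    // closing the reader automatically via try-with-resources.
    static String readAllAsOneLine(String path) throws IOException {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader br = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = br.readLine()) != null) {
                sb.append(line);
            }
        } // br.close() is called here even if readLine() throws
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Demo with a temporary file (foo.pem from the question is not available here)
        Path tmp = Files.createTempFile("demo", ".txt");
        Files.write(tmp, java.util.Arrays.asList("line1", "line2"));
        System.out.println(readAllAsOneLine(tmp.toString())); // prints "line1line2"
        Files.delete(tmp);
    }
}
```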
It just reads a file and concatenates the lines into a single line, but I could not find any close() call anywhere, so I thought it should leak the reader and eventually cause a "too many open files" error. To prove this, I wrote a test:
for (int i = 0; i < 7168; i++) { // ulimit -n ==> 7168
BufferedReader br = new BufferedReader(new FileReader("src/main/resources/privateKey/foo.pem"));
System.out.println(br.readLine());
}
System.in.read();
Strangely, everything was fine; the expected exception was never thrown.
Checking the actually open files on the command line:
➜ ~ lsof -p 16276 | grep 'foo.pem' | wc -l
2538
Why is it only 2538, not 7168? What's wrong here, and how can I actually trigger the "too many open files" error?
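My current guess is that GC finalizes the unreferenced streams and closes their descriptors, which would explain why lsof sees fewer than 7168. A sketch to test that hypothesis from inside the JVM (it relies on the HotSpot-on-Unix-specific com.sun.management.UnixOperatingSystemMXBean, and the finalization timing is not guaranteed, so the final count may not drop deterministically):

```java
import com.sun.management.UnixOperatingSystemMXBean;
import java.io.BufferedReader;
import java.io.FileReader;
import java.lang.management.ManagementFactory;

public class FdWatcher {
    // HotSpot-on-Unix specific: the JVM's current open-descriptor count
    static long openFds() {
        return ((UnixOperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean())
                .getOpenFileDescriptorCount();
    }

    public static void main(String[] args) throws Exception {
        String path = args.length > 0 ? args[0] : "src/main/resources/privateKey/foo.pem";
        for (int i = 0; i < 7168; i++) {
            BufferedReader br = new BufferedReader(new FileReader(path)); // never closed
            br.readLine();
            if (i % 1024 == 0) {
                System.out.println("i=" + i + ", open fds=" + openFds());
            }
        }
        // If the theory is right, forcing GC + finalization should shrink the count,
        // because FileInputStream's finalizer closes the underlying descriptor.
        System.gc();
        System.runFinalization();
        Thread.sleep(500); // give the finalizer thread a moment (best effort only)
        System.out.println("after GC: open fds=" + openFds());
    }
}
```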
As @GhostCat suggested, I changed 7168 to Integer.MAX_VALUE; this time it threw
java.io.FileNotFoundException: src/main/resources/privateKey/foo.pem (Too many open files in system)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
when i was 27436. In this case, checking the actually open files on the command line gives:
➜ ~ lsof | grep foo.pem | wc -l
7275
But where are the remaining files (27436 - 7275)? And why doesn't the ulimit number take effect?
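If GC finalization is what kept my first test alive, then holding strong references should stop anything from being closed, and the per-process limit should bite at roughly ulimit -n minus the descriptors the JVM already has open. A sketch under that assumption (openUntilFailure and the max cap are my additions; the path is the one from the question):

```java
import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class HoldReferences {
    // Opens readers on `path` until the OS refuses (or `max` is reached),
    // holding strong references so GC cannot finalize/close them meanwhile.
    static int openUntilFailure(String path, int max) throws IOException {
        List<BufferedReader> keepAlive = new ArrayList<>();
        try {
            for (int i = 0; i < max; i++) {
                try {
                    keepAlive.add(new BufferedReader(new FileReader(path)));
                } catch (FileNotFoundException e) {
                    // "Too many open files" surfaces as a FileNotFoundException in Java
                    System.out.println("failed at i=" + i + ": " + e.getMessage());
                    return i;
                }
            }
            return max;
        } finally {
            for (BufferedReader br : keepAlive) {
                br.close(); // release everything so the JVM stays usable afterwards
            }
        }
    }

    public static void main(String[] args) throws IOException {
        openUntilFailure("src/main/resources/privateKey/foo.pem", Integer.MAX_VALUE);
    }
}
```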