I am trying to read a text file which currently has around 3 lakh (300,000) lines.
How am I reading it?
Using java.io.BufferedReader.
Here is a small code snippet which represents my approach.
import java.io.BufferedReader;
import java.io.FileReader;

int lineNumber = 1;
String currentLine = null;
// f is the file to be read
try (BufferedReader br = new BufferedReader(new FileReader(f))) {
    while ((currentLine = br.readLine()) != null) {
        // processing happens after every 1000 lines: each line is put in a
        // List, and once the 1000-line batch is full it is processed, the
        // list is cleared, and the loop continues with the next 1000 lines
        lineNumber++;
    }
}
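Fleshed out, the batching logic inside that loop looks roughly like this (processBatch is just a placeholder name for my actual processing):

import java.util.ArrayList;
import java.util.List;

List<String> batch = new ArrayList<>();
String line;
while ((line = br.readLine()) != null) {
    batch.add(line);
    if (batch.size() == 1000) {
        processBatch(batch); // placeholder for the per-batch processing
        batch.clear();       // clear the list, continue with the next 1000 lines
    }
}
if (!batch.isEmpty()) {
    processBatch(batch);     // handle the final partial batch
}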
I have also tried NIO.2, as follows:
br = Files.newBufferedReader(Paths.get(inputFileName), StandardCharsets.UTF_16);
It resulted in the following exception:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Unknown Source)
at java.lang.AbstractStringBuilder.expandCapacity(Unknown Source)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(Unknown Source)
at java.lang.AbstractStringBuilder.append(Unknown Source)
at java.lang.StringBuffer.append(Unknown Source)
at java.io.BufferedReader.readLine(Unknown Source)
at java.io.BufferedReader.readLine(Unknown Source)
at TexttoExcelMerger.readFileLineByLine(TexttoExcelMerger.java:66)
at TexttoExcelMerger.main(TexttoExcelMerger.java:255)
Firstly, is my approach right?

Secondly, are there any more efficient and faster approaches in NIO.2, Apache FileUtils, or any other API that would speed up my file-reading process? Can I read a set of lines at once, say the first 1000, with something like

br.readFirst(1000);

but without reading line by line or iterating as in my logic?
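For instance, I wonder whether a stream-based read along these lines would be any faster (just a sketch using Java 8's Files.lines, assuming the file is UTF-8; offset is a hypothetical variable tracking how many lines were already processed, and I suspect this still iterates line by line internally):

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Sketch: pull one 1000-line batch out of the file in a single expression.
try (Stream<String> lines = Files.lines(Paths.get(inputFileName), StandardCharsets.UTF_8)) {
    List<String> batch = lines.skip(offset)  // offset = lines already handled
                              .limit(1000)   // take the next 1000 lines
                              .collect(Collectors.toList());
    // process batch here
}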