
I have a huge file containing at least 10 million lines, and I need to search it for a particular unique word (an ID) using Java. Please suggest the best and fastest way to do this with minimal processing time.

Would the java.util.concurrent package be useful here? If yes, please suggest how.

2 Answers


You could split the reading and the searching into separate threads. That way, your searching will not interrupt your reading. However, as IO will be the major bottleneck, I doubt you will see much of a performance increase.
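A minimal sketch of that split, assuming a BlockingQueue from java.util.concurrent as the hand-off between the two threads (the file name, search term, queue capacity, and end-of-input marker are placeholders, not anything from your setup):

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ReaderSearcherSplit {
    // Sentinel line that signals "no more input"; assumed never to appear in the file.
    private static final String POISON_PILL = "##END_OF_FILE##";

    public static void main(String[] args) throws Exception {
        final String search = "searchMe";                        // the ID to look for
        final BlockingQueue<String> queue = new ArrayBlockingQueue<>(10_000);

        // Reader thread: pulls lines from disk and hands them to the queue.
        Thread reader = new Thread(() -> {
            try (BufferedReader br = new BufferedReader(new FileReader("MYFILE"))) {
                String line;
                while ((line = br.readLine()) != null) {
                    queue.put(line);
                }
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                try {
                    queue.put(POISON_PILL);                      // tell the searcher we are done
                } catch (InterruptedException ignored) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        // Searcher thread: compares each queued line against the ID.
        Thread searcher = new Thread(() -> {
            try {
                boolean found = false;
                String line;
                // Keep draining until the sentinel so the reader never blocks on a full queue.
                while (!(line = queue.take()).equals(POISON_PILL)) {
                    if (!found && line.equals(search)) {
                        System.out.println(search + " was found.");
                        found = true;
                    }
                }
                if (!found) {
                    System.out.println("not found");
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        reader.start();
        searcher.start();
        reader.join();
        searcher.join();
    }
}

Even with this split, both threads are ultimately limited by how fast the disk can deliver the 10 million lines.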

PeterK

Try the following code; hopefully it will finish within your time limit of 1 min.

import java.io.BufferedReader;
import java.io.FileReader;

String search = "searchMe";
String thisLine = null;
try {
    // Open MYFILE for line-by-line reading.
    BufferedReader br = new BufferedReader(new FileReader("MYFILE"));
    boolean found = false;
    while ((thisLine = br.readLine()) != null) {
        if (thisLine.equals(search)) {
            System.out.println(search + " was found.");
            found = true;
            break;
        }
    }
    if (!found) {
        System.out.println("not found");
    }
    br.close();
} catch (Exception e) {
    e.printStackTrace();
}

It might be faster to use some other program to split the file into smaller temporary files and then use threads to work on each file (I'm not 100% sure that would help).

Obviously the best solution would be to edit the program that generates this file so that it creates a few different files (possibly on different hard drives); then you can easily use threading to increase the speed, as in the sketch below.
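A rough sketch of that idea, assuming the big file has already been split into chunks (the chunk file names and the search term here are placeholders), using an ExecutorService to scan the chunks in parallel:

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelChunkSearch {

    public static void main(String[] args) throws Exception {
        final String search = "searchMe";   // the ID to look for
        // Placeholder names for the smaller files the big file was split into.
        List<String> chunks = Arrays.asList("chunk1.txt", "chunk2.txt",
                                            "chunk3.txt", "chunk4.txt");

        // One worker per chunk; each worker scans its own file.
        ExecutorService pool = Executors.newFixedThreadPool(chunks.size());
        List<Callable<Boolean>> tasks = new ArrayList<>();
        for (final String file : chunks) {
            tasks.add(() -> containsLine(file, search));
        }

        boolean found = false;
        for (Future<Boolean> result : pool.invokeAll(tasks)) {
            found |= result.get();          // waits for that worker to finish
        }
        System.out.println(found ? search + " was found." : "not found");
        pool.shutdown();
    }

    // Scans a single chunk line by line for an exact match.
    private static boolean containsLine(String file, String search) throws Exception {
        try (BufferedReader br = new BufferedReader(new FileReader(file))) {
            String line;
            while ((line = br.readLine()) != null) {
                if (line.equals(search)) {
                    return true;
                }
            }
        }
        return false;
    }
}

This only pays off if the chunks really can be read independently (e.g. on different drives); threads reading one disk will still contend for the same IO.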

nafas