
I'm trying to write a large result set (up to 2 million rows) to an .xlsx file. To avoid heap-space/out-of-memory errors, I decided to fetch a limited batch of records from the table on each call and append them to the file. The batches are ordered by rownum. Each call to the table fetches 50K records, and after all records have been fetched they are appended to the .xlsx file. At present I'm putting only 600,000 records into each sheet. My current logic is below.

But the logic below takes a lot of time to create the .xlsx file. Is there anything that can be done to reduce the time? It also fails after creating the second sheet.

    String query = "select count(*) from employee";
    String actquery = "";
    Class.forName("oracle.jdbc.driver.OracleDriver").newInstance();
    conn = DriverManager.getConnection(jdbcURL, user, passwd);
    stmt = conn.createStatement();
    rs = stmt.executeQuery(query);
    stmt.setFetchSize(1000);
    int rowcounter;
    int increaseCount = 50000;
    int jdbclimit = 50000; // jdbc
    int excelrowlimit = 600000;
    int excellimit = 600000;
    int rownumStartCnt = 0;
    int rownumEndCnt = 0;
    if (rs.next()) {
        outStream = new FileOutputStream("D:\\Test.xlsx", true);
        int count = rs.getInt(1);
        System.out.println("Total records" + count);
        if (count > 0) {
            int sheets = count / excelrowlimit;
            if (count % excelrowlimit > 0) {
                sheets = sheets + 1;
            }
            for (int scount = 1; scount <= sheets; scount++) {
                sheet = workbook.createSheet("GapHistory-" + scount);
                rowcounter = 1;
                System.out.println("Sheet:" + scount);
                System.out.println("limit is " + jdbclimit);
                while (rowcounter < excelrowlimit) {
                    rownumStartCnt = rownumEndCnt + 1;
                    rownumEndCnt = jdbclimit;
                    jdbclimit = jdbclimit + increaseCount;
                    if (rownumEndCnt > excellimit) {
                        rownumEndCnt = excellimit;
                    }
                    if (rownumStartCnt > count) {
                        break;
                    }
                    PreparedStatement stmt2 = null;
                    ResultSet rs2 = null;
                    ResultSetMetaData rsmd = null;

                    actquery = actquery + "SELECT * FROM (SELECT t.*, ROW_NUMBER() OVER (ORDER BY employee_number) rnum FROM employee t) WHERE rnum BETWEEN ? and ?";
                    stmt2 = conn.prepareStatement(actquery);
                    stmt2.setInt(1, rownumStartCnt);
                    stmt2.setInt(2, rownumEndCnt);
                    rs2 = stmt2.executeQuery();
                    System.out.println("Start row-->" + rownumStartCnt + "End row--->" + rownumEndCnt);
                    rsmd = rs2.getMetaData();
                    stmt2.setFetchSize(1000);
                    rs2.setFetchDirection(ResultSet.FETCH_FORWARD);
                    while (rs2.next()) {
                        if (rowcounter == 1) {
                            row = sheet.createRow(rowcounter);
                            for (int i = 1; i <= 68; i++) {
                                cell = row.createCell(i);
                                cell.setCellValue(rsmd.getColumnName(i));
                            }
                        }
                        rowcounter++;
                        for (int i = 1; i <= 68; i++) {
                            if (i == 1) {
                                row = sheet.createRow(rowcounter);
                            }
                            cell = row.createCell(i);
                            cell.setCellValue(rs2.getString(i));
                        }
                    }
                    actquery = "";
                }
                excellimit = excellimit + excelrowlimit; // Got rid of the exception after adding this line
            }
        }
    }
Rajshekar
    Please format your code properly. You are asking strangers for help, the very least you can do is make your code readable. – Jim Garrison Dec 06 '17 at 05:26

1 Answer

java.sql.SQLException: ORA-01000: maximum open cursors exceeded

This means you are opening many ResultSets and never closing them. Eventually you exhaust the driver's resources (Oracle's per-session open-cursor limit). You must close each PreparedStatement and ResultSet when you are done with it.
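A minimal sketch of that fix using try-with-resources (Java 7+). `PagedExport`, `exportPage`, and `pageBounds` are hypothetical names; the query mirrors the one in the question, and the Excel-writing step is elided:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class PagedExport {

    // Start/end ROWNUM bounds (inclusive) for a 1-based page of the given size.
    public static int[] pageBounds(int page, int pageSize) {
        int start = (page - 1) * pageSize + 1;
        return new int[] { start, start + pageSize - 1 };
    }

    // Each page's statement and result set are closed automatically when the
    // try block exits, so open cursors never accumulate across iterations.
    public static void exportPage(Connection conn, int page, int pageSize) throws SQLException {
        String sql = "SELECT * FROM (SELECT t.*, ROW_NUMBER() OVER (ORDER BY employee_number) rnum"
                   + " FROM employee t) WHERE rnum BETWEEN ? AND ?";
        int[] bounds = pageBounds(page, pageSize);
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setFetchSize(1000);          // set BEFORE executeQuery, not after
            ps.setInt(1, bounds[0]);
            ps.setInt(2, bounds[1]);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    // append the row to the current sheet here
                }
            }
        } // ps and rs are closed here even if an exception was thrown
    }
}
```

Note that the question's code also calls `setFetchSize` on the statement *after* `executeQuery`, where it has no effect on the already-executed query; the sketch sets it beforehand.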

And BTW, XLSX has a maximum row count of 2^20, or 1,048,576. You cannot store 2 million rows in an XLSX worksheet.
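For reference, the sheet arithmetic works out as follows; `SheetSplit` and `sheetsNeeded` are hypothetical helper names, not part of the question's code:

```java
public class SheetSplit {
    // Hard limit on rows per worksheet in the XLSX format (2^20).
    public static final int XLSX_MAX_ROWS = 1_048_576;

    // Number of sheets needed to hold totalRows at rowsPerSheet rows each.
    public static int sheetsNeeded(long totalRows, int rowsPerSheet) {
        if (rowsPerSheet < 1 || rowsPerSheet > XLSX_MAX_ROWS) {
            throw new IllegalArgumentException("rowsPerSheet must be 1.." + XLSX_MAX_ROWS);
        }
        return (int) ((totalRows + rowsPerSheet - 1) / rowsPerSheet); // ceiling division
    }
}
```

At 600,000 rows per sheet, 2 million rows fit in 4 sheets, each comfortably under the format's limit; a single sheet of 2 million rows would not be valid.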

Jim Garrison
  • Thanks for your reply. Yes, I'm aware of the .xlsx limitations; that's why I create a new sheet once the record count exceeds 600,000. Basically I'm trying to put only 600,000 records into each sheet. Moreover, the error was due to the logic not being followed correctly; look at the output log and you will see. – Rajshekar Dec 06 '17 at 05:46
  • I've removed the exception now, but I need help minimizing the time taken to generate the .xlsx file. – Rajshekar Dec 06 '17 at 07:27
  • Please [edit] your post and (1) format the code correctly; and (2) update the narrative to reflect the current situation, i.e. remove the part about the exception. – Jim Garrison Dec 06 '17 at 08:46
  • Edited the post with the necessary comments. Now I want to reduce the time taken to create the Excel file; please guide me. Is there any option so that writing to the file can start as soon as the first set of records is fetched from the result set, instead of writing everything at once? – Rajshekar Dec 06 '17 at 09:21