We are having an issue processing a huge dataset. (It is not that huge in absolute terms, but it is huge for us: we have never had to pull more than 100k rows before with Spring MVC + Oracle.)
The requirement is this: every month we need to generate reports and mail them to the client's sales team. The report is generated as an Excel file, and this is where we end up pulling such a huge amount of data.
We are using Spring MVC with Oracle as the DB. When I run the query directly in the DB, it pulls the data effortlessly, consistently within 2 seconds. But when I call it through SimpleJdbcCall or a stored procedure in Spring, I get a Java Heap Space error. I increased the heap space to 2 GB (the JVM setting is shown below), but it made no difference. My system configuration is:
i3-4160 @ 3.6 GHz (quad core?) / 8 GB RAM / Windows 7 Pro 64-bit.
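For reference, this is how the heap was bumped to 2 GB. This is a sketch assuming the app runs under Tomcat on Windows; the actual startup script may differ.

rem setenv.bat -- assumed Tomcat startup config; only the max heap flag matters here
set JAVA_OPTS=%JAVA_OPTS% -Xmx2048m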
Please help me figure out where I'm going wrong and how to handle this. I've pasted the code below.
FlexiView.java (which extends StoredProcedure)
package com.web.helper;

import java.util.List;
import java.util.Map;

import oracle.jdbc.OracleTypes;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.object.StoredProcedure;

import com.domain.ViewFormDO;

public class FlexiView extends StoredProcedure {

    public FlexiView(JdbcTemplate jdbcTemplate, final String storedProc, ViewFormDO viewFormDO, int reportId) {
        super(jdbcTemplate, storedProc);
        jdbcTemplate.setFetchSize(20000);
        // The OUT cursor is mapped row by row into beans by FlexiViewMapper
        declareParameter(new SqlOutParameter("P_CUR", OracleTypes.CURSOR, new FlexiViewMapper(reportId, viewFormDO)));
        declareParameter(new SqlParameter("P_AGENT_ID", OracleTypes.NUMBER));
        this.compile();
    }

    @SuppressWarnings("unchecked")
    public <T> List<T> executeStoredProc(final Map<String, Object> valueMap) {
        System.out.println(">>" + valueMap.size());
        // execute stored procedure; the mapped rows come back under the OUT parameter name
        Map<String, Object> resultMap = super.execute(valueMap);
        return (List<T>) resultMap.get("P_CUR");
    }
}
FlexiViewMapper (the custom RowMapper)
package com.web.helper;

import java.sql.ResultSet;
import java.sql.SQLException;
import java.text.SimpleDateFormat;

import org.springframework.jdbc.core.RowMapper;

import com.domain.ViewFormDO;
import com.domain.ViewFormGridDO;

public class FlexiViewMapper implements RowMapper<Object> {

    int reportId = 0;
    final SimpleDateFormat dtFormat = new SimpleDateFormat("dd-MM-yyyy");
    ViewFormDO viewFormDO;

    public FlexiViewMapper(int reportId, ViewFormDO viewFormDO) {
        this.reportId = reportId;
        this.viewFormDO = viewFormDO;
    }

    @Override
    public Object mapRow(ResultSet rs, int rowNum) throws SQLException {
        ViewFormGridDO flexiActiveGridDet = new ViewFormGridDO();
        if (reportId == 1001 || reportId == 1002) {
            // Get data from DB and assign it in ViewFormGridDO Bean
        }
        return flexiActiveGridDet;
    }
}
And I call this in the DAO like this:
@Override
public List<ViewFormGridDO> getFlexiGuideEnqGridDet1(final ViewFormDO viewFormDO) throws NsureException {
    List<ViewFormGridDO> flexiGuideGridDetails = null;
    final int reportId = viewFormDO.getFlexiTypeId() == GlobalNames.SCHEME_XXX ? 1002 : viewFormDO.getReportId();

    LinkedHashMap<String, Object> valueMap = new LinkedHashMap<String, Object>();
    valueMap.put("P_AGENT_ID", viewFormDO.getAgentId());

    FlexiView flexiView = new FlexiView(jdbcTemplate, "PKG_XXXXX.prGetXXXXX", viewFormDO, reportId);
    flexiGuideGridDetails = flexiView.executeStoredProc(valueMap);
    return flexiGuideGridDetails;
}
FINAL UPDATE (a.k.a. the solution?):
I ended up doing this. My bean ViewFormDO has 100+ fields, but I need only 20-30 of them, so the list of beans easily crossed the 500 MB mark. To avoid this, I changed the row mapper in FlexiViewMapper to return a JSONObject holding only the required fields instead of the full bean.
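For reference, the reworked mapper looks roughly like this. It's a minimal sketch: the class name and column names are placeholders, and I'm assuming org.json's JSONObject here; the point is simply that only the 20-30 required columns are copied per row instead of the 100+ field bean.

package com.web.helper;

import java.sql.ResultSet;
import java.sql.SQLException;

import org.json.JSONObject;
import org.springframework.jdbc.core.RowMapper;

public class FlexiViewJsonMapper implements RowMapper<JSONObject> {

    @Override
    public JSONObject mapRow(ResultSet rs, int rowNum) throws SQLException {
        // Copy only the columns the report actually needs, not the whole bean
        JSONObject row = new JSONObject();
        row.put("AGENT_ID", rs.getLong("AGENT_ID"));          // placeholder column name
        row.put("SCHEME_NAME", rs.getString("SCHEME_NAME"));  // placeholder column name
        row.put("CREATED_ON", rs.getString("CREATED_ON"));    // placeholder column name
        // ... remaining required columns only (20-30 in total)
        return row;
    }
}

The out-parameter declaration in FlexiView then just uses this mapper in place of the old one.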
Still, generating the Excel file with 7 lakh (700,000) records kept producing the Heap Space error, so I capped the report at 5 lakh (500,000) records. I also changed the code logic to eliminate unnecessary values, which brought the data down from 72 lakh to about 2 lakh rows; yes, a lot of useless combinations were being generated. After all these optimizations, even if I select ALL filters to generate the report, the row count does not cross the 5 lakh mark, and I'm hoping it stays that way. We have covered all possible scenarios!