I'm trying to migrate data from an Oracle database to a SQL Server database. I'm currently using a JdbcCursorItemReader to read the Oracle data and a JdbcBatchItemWriter to write it to SQL Server.
My problem is that this is taking far too long: a table of ~200,000 rows takes almost an hour, and I have to run four of these queries, each around 200K rows.
@Bean
public JdbcCursorItemReader<DataPOJO> dataReader() {
    // Streams rows from Oracle with a forward-only cursor, mapping each row to a DataPOJO
    final JdbcCursorItemReader<DataPOJO> dataReader = new JdbcCursorItemReader<>();
    dataReader.setDataSource(oracleDataSource);
    dataReader.setSql(Constants.DATA_QUERY);
    dataReader.setRowMapper(new BeanPropertyRowMapper<>(DataPOJO.class));
    return dataReader;
}
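One thing I've experimented with (gains not yet measured) is raising the cursor fetch size, since the Oracle driver's default is only 10 rows per network round trip. A sketch, applied to the reader above with a guessed value of 1000:

// Hypothetical tuning, added inside dataReader() before returning:
dataReader.setFetchSize(1000);             // rows per round trip; Oracle's driver defaults to 10
dataReader.setVerifyCursorPosition(false); // skips per-row cursor position checks if they aren't needed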
@Bean
public JdbcBatchItemWriter<DataPOJO> dataWriter() {
    // Writes each chunk to SQL Server as one JDBC batch of the MERGE statement
    JdbcBatchItemWriter<DataPOJO> dataWriter = new JdbcBatchItemWriter<>();
    dataWriter.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>());
    dataWriter.setSql(Constants.DATA_MERGE);
    dataWriter.setDataSource(mssDataSource);
    return dataWriter;
}
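For context, the step wiring looks roughly like this; the chunk size controls how many rows go into each batched MERGE and each transaction, and the value 1000 below is just a guess to illustrate:

@Bean
public Step migrateDataStep(StepBuilderFactory stepBuilderFactory) throws Exception {
    // Reads 1000 items at a time, then writes them as a single JDBC batch in one transaction
    return stepBuilderFactory.get("migrateDataStep")
            .<DataPOJO, DataPOJO>chunk(1000)
            .reader(dataReader())
            .writer(dataWriter())
            .build();
}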
Does anyone have any tips for improving performance on these kinds of jobs?
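One idea I've considered but not yet tried: since a batched MERGE can't use the Microsoft driver's bulk-copy path, I could batch plain INSERTs into a staging table and run a single MERGE at the end. If I'm reading the docs right, the mssql-jdbc driver (7.0+) can then convert batched inserts into a bulk copy via the useBulkCopyForBatchInsert connection property; the host and database names below are placeholders:

jdbc:sqlserver://myserver:1433;databaseName=StagingDb;useBulkCopyForBatchInsert=true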