I am extracting thousands of rows from one database table and inserting them into a table in another database. I don't want to load all the records into memory before inserting them into the other database.
Because of that, I am trying to use a BlockingQueue: one thread loads the extractor's results into the queue, while another thread concurrently takes them off the queue and inserts them into the other database. I am using Spring's JdbcTemplate to access my databases.
Here is my plan:
public void performExtractionInsertion(final JdbcTemplate inboundJdbcTemplate,
                                       final JdbcTemplate outboundJdbcTemplate) {
    // Bounded queue: holds at most 50 Transaction objects in memory at a time
    final BlockingQueue<Transaction> queue = new LinkedBlockingQueue<Transaction>(50);
    ExecutorService executor = Executors.newFixedThreadPool(2);
    final String SELECT_QUERY = "SELECT acc_number, date, type FROM transactions";
    final String INSERT_QUERY = "INSERT INTO analysis(col1, col2, col3) VALUES(?,?,?)";
    executor.execute(new Runnable() {
        @Override
        public void run() {
            // EXTRACTOR: stream rows with inboundJdbcTemplate and queue.put(...)
            // each Transaction; put(...) blocks while the queue is full and
            // throws InterruptedException, which has to be handled here
        }
    });
    executor.execute(new Runnable() {
        @Override
        public void run() {
            // INSERTER: queue.take() each Transaction and insert it with
            // outboundJdbcTemplate; take() blocks while the queue is empty
            // and also throws InterruptedException
        }
    });
    executor.shutdown();
}
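To make the idea concrete, here is a minimal, self-contained sketch of the producer/consumer coordination only, with the two databases replaced by in-memory lists so it runs as-is. The `Transaction` stub, the `POISON_PILL` sentinel that tells the inserter no more rows are coming, and the `run` helper are all my own stand-ins, not part of the real code; in the real version the extractor loop would sit inside `inboundJdbcTemplate.query(SELECT_QUERY, ...)` and each consumed row would go to `outboundJdbcTemplate.update(INSERT_QUERY, ...)`.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class PipelineSketch {

    // Stand-in for the real Transaction class
    static class Transaction {
        final String accNumber;
        Transaction(String accNumber) { this.accNumber = accNumber; }
    }

    // Sentinel object signalling "no more rows" (a "poison pill")
    static final Transaction POISON_PILL = new Transaction("EOF");

    static List<Transaction> run(int rowCount) {
        final BlockingQueue<Transaction> queue = new LinkedBlockingQueue<Transaction>(50);

        // Stand-ins for the source and target tables
        final List<Transaction> sourceRows = new ArrayList<Transaction>();
        for (int i = 0; i < rowCount; i++) {
            sourceRows.add(new Transaction("acc-" + i));
        }
        final List<Transaction> sink = new ArrayList<Transaction>();

        ExecutorService executor = Executors.newFixedThreadPool(2);

        // EXTRACTOR: in the real code this loop would be driven by
        // inboundJdbcTemplate.query(SELECT_QUERY, ...)
        executor.execute(new Runnable() {
            @Override
            public void run() {
                try {
                    for (Transaction t : sourceRows) {
                        queue.put(t);        // blocks once 50 rows are buffered
                    }
                    queue.put(POISON_PILL);  // tell the inserter we are done
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        // INSERTER: in the real code each taken row would feed
        // outboundJdbcTemplate.update(INSERT_QUERY, ...)
        executor.execute(new Runnable() {
            @Override
            public void run() {
                try {
                    while (true) {
                        Transaction t = queue.take();  // blocks while queue is empty
                        if (t == POISON_PILL) {
                            break;
                        }
                        sink.add(t);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });

        executor.shutdown();
        try {
            executor.awaitTermination(1, TimeUnit.MINUTES);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return sink;
    }

    public static void main(String[] args) {
        System.out.println(run(1000).size());  // prints 1000
    }
}
```

The bounded capacity of 50 is what caps memory: whenever the inserter falls behind, `put` simply blocks the extractor until space frees up.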
Could someone give me an idea of how to implement the EXTRACTOR and INSERTER so that they use the same BlockingQueue to limit the number of rows in memory?
Is this the right approach? Can I still use JdbcTemplate? What is the most intelligent and convenient way to do this?
Thanks guys
BTW, Transaction is the class of the object that will hold the extracted elements to be inserted.
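One refinement worth considering on the inserter side: instead of one `take()` (and one INSERT) per row, the consumer can block for the first row and then use `BlockingQueue.drainTo(collection, maxElements)` to grab whatever else is already buffered, without blocking, and insert the whole batch at once (e.g. via `batchUpdate` on the outbound template). A runnable sketch of just that draining step, with plain strings standing in for `Transaction` rows:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class BatchDrainSketch {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<String>(50);
        for (int i = 0; i < 25; i++) {
            queue.put("row-" + i);
        }

        List<String> batch = new ArrayList<String>();
        batch.add(queue.take());   // block until at least one row is available
        queue.drainTo(batch, 9);   // then drain up to 9 more -> batches of up to 10

        System.out.println(batch.size());  // prints 10
        // in the real code: one outboundJdbcTemplate.batchUpdate(INSERT_QUERY, ...) per batch
    }
}
```

Batching this way cuts round trips to the target database while keeping the same memory bound, since the batch can never exceed the queue capacity plus one.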