I have a Spring Boot application that batch inserts hundreds of thousands of rows across two tables: for each record I insert a row into the first table, get back the generated id, and use it to insert a row into the second table. As it stands it is simply not fast enough, and I would like to know the fastest way to do this. Here is my code:
private void insertData(DataList request, long personId, String tableName) {
    try {
        String sql = """
                WITH inserted AS
                    (INSERT INTO user_%s (username, password, email, phone, person_id) VALUES (?,?,?,?,?) RETURNING user_id)
                INSERT INTO info_%s (user_id, info, full_text_search) VALUES ((SELECT user_id FROM inserted), ?, to_tsvector(?));
                """.formatted(tableName, tableName);
        jdbcTemplate.batchUpdate(sql, new BatchPreparedStatementSetter() {
            @Override
            public void setValues(@Nonnull PreparedStatement ps, int i) throws SQLException {
                Data data = request.getData(i);
                // Parameters 1-5 feed the user_%s insert, 6-7 the info_%s insert
                ps.setString(1, data.getUsername());
                ps.setString(2, data.getPassword());
                ps.setString(3, data.getEmail());
                ps.setString(4, data.getPhone());
                ps.setLong(5, personId);
                ps.setString(6, data.getInfo());
                ps.setString(7, data.getInfo());
            }

            @Override
            public int getBatchSize() {
                return request.getDataCount();
            }
        });
    } catch (Exception e) {
        log.error(e.getMessage(), e);
    }
}
I'm using PostgreSQL as my database.
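One thing I came across while searching is the PostgreSQL JDBC driver's `reWriteBatchedInserts` connection property, which collapses batched inserts into multi-row statements. I'm not sure it even applies to a multi-statement CTE like mine, but this is the datasource property I would set to try it (host, port, and database name here are placeholders for my real ones):

```
# application.properties (hypothetical URL)
spring.datasource.url=jdbc:postgresql://localhost:5432/mydb?reWriteBatchedInserts=true
```

Would that help here, or do I need to restructure the statement itself?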