
I have a batch processing system that aggregates success counts. Many servers perform the aggregation, all updating a single row in a table concurrently. I execute prepared statements using Hibernate like this:

SQL_UPDATE = "UPDATE STATS.COUNTS SET SUCCESSCOUNT = SUCCESSCOUNT + ? WHERE ID = ?"
update = session.connection().prepareStatement(SQL_UPDATE);
update.setLong(1, data.getSuccessCount());
update.setLong(2, data.getBatchId());
update.execute();

There were a few concurrency issues, and I decided to use pessimistic locking since the chances of collision are pretty high. How do I perform pessimistic locking when using prepared statements like this in Hibernate?

zolo
  • 149
  • 2
  • 10
  • You're doing this via native SQL rather than via Hibernate so it cannot help you. If you carry out the query in a transaction do a `SELECT...FOR UPDATE` first that will lock as you want. – Boris the Spider Jan 02 '14 at 12:02
  • @BoristheSpider the above is being performed in a transaction itself. By `SELECT...FOR UPDATE` you mean retrieving the object first and then doing an update? If I do the `SELECT...FOR UPDATE`, isn't it two queries hitting the db (1 for select & 1 for update) instead of 1 in this case? – zolo Jan 02 '14 at 14:17
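The `SELECT ... FOR UPDATE` approach suggested in the comments could look roughly like this in plain JDBC. This is a hedged sketch, not the poster's code: the class and method names are made up, and it assumes auto-commit is off on the connection obtained from the Hibernate session so the row lock is held until commit.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Sketch: lock the counter row first, then apply the increment.
// Other transactions attempting the same SELECT ... FOR UPDATE block
// until this transaction commits, so the read-modify-write is safe.
public class PessimisticCounterUpdate {
    public static final String SQL_LOCK =
        "SELECT SUCCESSCOUNT FROM STATS.COUNTS WHERE ID = ? FOR UPDATE";
    public static final String SQL_UPDATE =
        "UPDATE STATS.COUNTS SET SUCCESSCOUNT = SUCCESSCOUNT + ? WHERE ID = ?";

    public static void addSuccessCount(Connection connection,
                                       long batchId,
                                       long successCount) throws Exception {
        // Step 1: lock the row for this transaction.
        try (PreparedStatement lock = connection.prepareStatement(SQL_LOCK)) {
            lock.setLong(1, batchId);
            try (ResultSet rs = lock.executeQuery()) {
                rs.next(); // row is now locked until commit/rollback
            }
        }
        // Step 2: perform the same UPDATE as in the question.
        try (PreparedStatement update = connection.prepareStatement(SQL_UPDATE)) {
            update.setLong(1, successCount);
            update.setLong(2, batchId);
            update.execute();
        }
        connection.commit(); // releases the row lock
    }
}
```

It is two statements instead of one, as the comment notes, but the extra `SELECT` is what acquires the lock; the cost is usually small compared to the retries or lost updates it prevents.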

2 Answers

1

You didn't mention setting the isolation level of your transactions. Have you tried setting it with something like `@Transactional(isolation = Isolation.SERIALIZABLE, propagation = Propagation.REQUIRED)`?

Otherwise, Hibernate's session lock methods work on mapped objects and do not apply to native SQL statements like yours.

1

If you are concerned about concurrency, you can raise the transaction isolation level from the default to SERIALIZABLE, the highest level of isolation. Be aware that the highest isolation level also increases the chance of deadlocks and can hurt performance.

Also, if it's a batch update, it's better not to use a high isolation level. You may consider evaluating batch technologies such as Spring Batch.
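At the JDBC level, raising the isolation as this answer suggests could be sketched like this (a hypothetical helper, assuming you have the `Connection` from the Hibernate session; the restore-afterwards pattern is an assumption, not the poster's code):

```java
import java.sql.Connection;

// Sketch: run the update at SERIALIZABLE isolation, then restore the
// connection's previous isolation level so pooled connections are not
// left in an unexpected state.
public class SerializableUpdate {
    public static void runSerializable(Connection connection) throws Exception {
        int previous = connection.getTransactionIsolation();
        connection.setTransactionIsolation(Connection.TRANSACTION_SERIALIZABLE);
        try {
            // ... execute the UPDATE from the question here ...
        } finally {
            connection.setTransactionIsolation(previous);
        }
    }
}
```

Note that some databases implement SERIALIZABLE by aborting conflicting transactions rather than blocking them, so the application may still need retry logic.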

shatk
  • 465
  • 5
  • 16