I have three tasks that need to be done on a relational database (SQL Server or Oracle):

1. Archive: copy rows whose status is S from Table1 to ArchiveTable1, with the archive status set to X.
2. Delete: mark ArchiveTable1 rows for delete, i.e. update the status from X to Y. ("Delete" here just means deleting references in related tables; the archived row itself is kept.)
3. Purge: mark ArchiveTable1 rows for purge, i.e. update the status from Y to Z. (Purge will delete the row from the archive table.)
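In SQL terms the three transitions look roughly like this; the column list (id, payload, status) is just a placeholder for the real schema, and these statements ignore the 1000-row batching described below:

```sql
-- 1. Archive: copy 'S' rows into the archive table with archive status 'X'
--    (id, payload, status are assumed column names)
INSERT INTO ArchiveTable1 (id, payload, status)
SELECT id, payload, 'X'
FROM   Table1
WHERE  status = 'S';

-- 2. Delete: mark archived rows for delete (X -> Y); a separate step removes
--    the references in related tables but keeps this row
UPDATE ArchiveTable1
SET    status = 'Y'
WHERE  status = 'X';

-- 3. Purge: mark rows for purge (Y -> Z); rows in status Z are later
--    physically deleted from the archive table
UPDATE ArchiveTable1
SET    status = 'Z'
WHERE  status = 'Y';
```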
We did this by writing code in an app server timer: for each of the above tasks it picks 1000 rows based on a timestamp and the specific status and processes them.
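Concretely, each timer tick does something like the following for the archive task (a sketch only: LAST_UPDATED is an assumed timestamp column, and on SQL Server the top-N would be SELECT TOP (1000) ... ORDER BY ... instead of ROWNUM):

```sql
-- Pick the oldest 1000 rows in status 'S' as this tick's batch (Oracle style).
SELECT id
FROM   (SELECT id
        FROM   Table1
        WHERE  status = 'S'
        ORDER BY last_updated)
WHERE  ROWNUM <= 1000;
-- The app then copies these rows into ArchiveTable1 with status 'X'
-- in the same transaction. The delete and purge tasks work the same way,
-- just against ArchiveTable1 with statuses X and Y respectively.
```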
This works fine if there is a single app server. But if there are multiple app servers, then in Oracle the table gets locked (not just the rows) and some of the other processes doing the tasks time out or hang.
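To make the contention concrete: with two app servers, both timers end up running the same kind of statement over the same candidate rows at roughly the same time, e.g. for the delete step (again a sketch with assumed column names):

```sql
-- Both servers try to claim/advance the same oldest-1000 rows (X -> Y),
-- so the second UPDATE waits on locks held by the first until it commits;
-- in our case Oracle appears to lock the whole table, not just those rows.
UPDATE ArchiveTable1
SET    status = 'Y'
WHERE  id IN (SELECT id
              FROM   (SELECT id
                      FROM   ArchiveTable1
                      WHERE  status = 'X'
                      ORDER BY last_updated)
              WHERE  ROWNUM <= 1000);
```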
My question is: are there database design patterns out there for doing this kind of batch processing? If so, what is the best pattern, and where can I find more details about it?