
I have a screen where a user clicks a button to start a process, and the backend process then updates a set of records (around 20k - 60k) with different values (whatever each row needs). The "Where" clause I am using is indexed, but the process takes around 45 to 90 minutes. The table being updated also has more than 50 million records. Is there anything I could do to improve this?

Will partitioning the table help?

chapter473
  • If you can, could you show us your query, the explain plan, and maybe a little explanation of your data structures? You could also speed up the update by splitting it into smaller batches of rows and running them in a loop in a PL/SQL procedure (a batching sketch follows these comments). – AndrewMcCoist Jan 11 '16 at 19:51
  • Showing code would certainly be helpful. Are you updating the rows one by one, or in bulk? – Alex Poole Jan 11 '16 at 20:13
  • This is a stored procedure and it updates the rows one by one, as the value is different for each row. I can't paste the code here, but I will try to post a query very similar to it. – chapter473 Jan 11 '16 at 23:08
  • You can update multiple rows at once with different values. Look into PL/SQL collections as a starting point, if your client layer doesn't support bulk updates directly (see the FORALL sketch after these comments). – Alex Poole Jan 11 '16 at 23:13
  • Can you show what your free % (PCTFREE) is on the table and columns? Your table might be too stingy with storage space when managing updates (a dictionary query for this follows the comments). – Hambone Jan 12 '16 at 03:10
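
A minimal sketch of the chunked-update idea from AndrewMcCoist's comment, shown for the simple case where every row gets the same new value; big_table, status, process_id, the literal 42 and the 5000-row chunk size are placeholders, not the asker's actual schema:

    BEGIN
      LOOP
        UPDATE big_table
           SET status = 'DONE'
         WHERE process_id = 42          -- the indexed predicate from the question
           AND status <> 'DONE'         -- only rows not yet processed
           AND ROWNUM <= 5000;          -- cap each pass at 5000 rows
        EXIT WHEN SQL%ROWCOUNT = 0;
        COMMIT;                         -- commit each chunk before the next pass
      END LOOP;
      COMMIT;
    END;
    /

Because the question says each row receives a different value, a single-statement version like this would need that value expressed in SQL (for example via a join or CASE); otherwise the collection-based sketch below is a closer fit.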
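
A sketch of the bulk-update route Alex Poole describes, using PL/SQL collections with BULK COLLECT and FORALL so rows are updated in batches even though each gets its own value; big_table, id, amount, process_id and compute_new_amount are hypothetical stand-ins for the real schema and per-row value logic:

    DECLARE
      CURSOR c_rows IS
        SELECT t.id, compute_new_amount(t.id) AS new_amount   -- placeholder for the per-row value
          FROM big_table t
         WHERE t.process_id = 42;                             -- the indexed WHERE clause
      TYPE t_id_tab  IS TABLE OF big_table.id%TYPE;
      TYPE t_val_tab IS TABLE OF big_table.amount%TYPE;
      l_ids  t_id_tab;
      l_vals t_val_tab;
    BEGIN
      OPEN c_rows;
      LOOP
        FETCH c_rows BULK COLLECT INTO l_ids, l_vals LIMIT 1000;
        EXIT WHEN l_ids.COUNT = 0;

        -- One UPDATE execution per 1000 rows instead of 1000 single-row statements.
        FORALL i IN 1 .. l_ids.COUNT
          UPDATE big_table
             SET amount = l_vals(i)
           WHERE id = l_ids(i);

        COMMIT;   -- optional per-batch commit to keep undo small
      END LOOP;
      CLOSE c_rows;
    END;
    /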
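
To answer Hambone's question about free space, the table-level PCTFREE setting can be read from the data dictionary; 'BIG_TABLE' below is a placeholder for the real table name:

    SELECT table_name, pct_free, pct_used
      FROM user_tables
     WHERE table_name = 'BIG_TABLE';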

0 Answers