
I have quite a large volume of data in an Oracle table. Through Oracle Pro*C code, I am using host arrays to drive a MERGE into the table, as in the following example:

merge into TBL_NM A 
using (select 1 from dual) B 
on (col1 = to_date(:v,'YYMMDD') and col2 = :v and col3 = :v and col4 = :v) 
when matched then 
update set col5 = :v, col2 = col2+ :v, col3 = col3 + :v 
when not matched then  
insert (col1, col2, col3, col4, col5, col6, col7, col8) 
values (to_date(:v,'YYMMDD'),:v,:v,:v,:v,:v,:v,:v)
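
For context, the Pro*C side looks roughly as follows. This is a hypothetical sketch, not my real code: the variable names, the `FOR :n` batching form, and the reduced column list are illustrative, and the statement simply mirrors the MERGE above.

```c
/* Hypothetical sketch of the batched host-array MERGE -- names, sizes,
 * and the reduced column list are illustrative, not the real code. */
#define BATCH_SIZE 40000

EXEC SQL BEGIN DECLARE SECTION;
char v_dt[BATCH_SIZE][7];      /* YYMMDD strings for col1      */
int  v_c2[BATCH_SIZE];
int  v_c3[BATCH_SIZE];
int  v_c4[BATCH_SIZE];
int  v_c5[BATCH_SIZE];
int  n;                        /* rows filled in this batch    */
EXEC SQL END DECLARE SECTION;

/* Fill the host arrays from the processed data, set n, then execute
 * the MERGE once per batch of up to BATCH_SIZE rows: */
EXEC SQL FOR :n
    MERGE INTO TBL_NM A
    USING (SELECT 1 FROM dual) B
    ON (col1 = TO_DATE(:v_dt,'YYMMDD') AND col2 = :v_c2
        AND col3 = :v_c3 AND col4 = :v_c4)
    WHEN MATCHED THEN
        UPDATE SET col5 = :v_c5, col2 = col2 + :v_c2, col3 = col3 + :v_c3
    WHEN NOT MATCHED THEN
        INSERT (col1, col2, col3, col4, col5)
        VALUES (TO_DATE(:v_dt,'YYMMDD'), :v_c2, :v_c3, :v_c4, :v_c5);
EXEC SQL COMMIT;
```

For more than BATCH_SIZE rows, the fill-and-execute step above is repeated until the source data is exhausted.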

However, performance degrades as the volume increases. The merge is taking too long, and I need to optimize it.

Would it be any better to use parallel DML together with the APPEND hint on the MERGE (upsert) SQL? I was reading an article on insert tuning, Oracle insert tuning (http://www.dba-oracle.com/t_insert_tuning.htm). Can I tune the MERGE statement similarly, as below:

alter session enable parallel dml;
merge /*+ parallel(A,10) append */
into TBL_NM A 
using (select 1 from dual) B 
on (col1 = to_date(:v,'YYMMDD') and col2 = :v and col3 = :v and col4 = :v) 
when matched then 
update set col5 = :v, col2 = col2+ :v, col3 = col3 + :v 
when not matched then  
insert (col1, col2, col3, col4, col5, col6, col7, col8) 
values (to_date(:v,'YYMMDD'),:v,:v,:v,:v,:v,:v,:v)

Please share your views, with explanations, and any other suggestions for tuning the MERGE.

Dr. Debasish Jana
  • This looks like the same question you asked in another thread. I asked you for some numbers regarding data volume in that thread, but you did not answer them there; perhaps you could answer them here? – BobC Feb 14 '17 at 08:41
  • Given that you want to throw parallelism at the problem, how many CPU cores does your system have? – BobC Feb 14 '17 at 08:45
  • @BobC With about 18 million rows passed through the upsert/merge (about 30% updates and 70% inserts), it takes approx. 12 hours, and the situation degrades with each further merge attempt. The host array size is 40K at a time; for more data, we execute the MERGE multiple times, each time with at most 40K rows passed through the host arrays. We have 4 CPU cores in the test environment and 8 CPU cores on Solaris 10 in production. – Dr. Debasish Jana Feb 14 '17 at 08:59
  • The roughly 18 million records occupy approx. 8 GB of space to handle in the merge. – Dr. Debasish Jana Feb 14 '17 at 09:05
  • What is the source of these 18M rows? is it a set of files, for example? – BobC Feb 14 '17 at 11:58
  • @BobC processed data is the source – Dr. Debasish Jana Feb 14 '17 at 13:07
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/135683/discussion-between-bobc-and-dr-debasish-jana). – BobC Feb 14 '17 at 13:15

0 Answers