
I have a table 'A' with 40 columns. I need to copy the data from 20 specific columns in 'A' to another table 'B' that has those 20 columns. There will be around 3 to 10 million records. What is the most efficient way to do this in PL/SQL?

cool_mo
2 Answers


"daily table B will be truncated and new data will be inserted into it from A."

Okay, so the most efficient way to do this is not to do it at all. Use a materialized view instead: a materialized view log on table A will let you capture incremental changes and apply them daily, or in any other window you like.

Compared to that approach, hand-rolled PL/SQL, or even pure SQL, is laughably inefficient.
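For reference, a minimal sketch of that approach, assuming A has a primary key (pk_id below is a placeholder, as are the column names); for a fast refresh the master table's primary key is included in the view's query, which is worth verifying against your Oracle version:

create materialized view log on A with primary key;

create materialized view B
  build immediate
  refresh fast on demand
as
  select pk_id, c1, c2 /* ..., c20 */ from A;

-- daily refresh, e.g. from a DBMS_SCHEDULER job; only the changes
-- recorded in the log since the last refresh are applied
begin
  dbms_mview.refresh(list => 'B', method => 'F');
end;
/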

APC

Do you need to do any sort of conversion on the data or is it just copying data straight from one table to another?

The easiest way to do this is a straight CREATE TABLE ... AS SELECT, although you would then have to create the indexes separately:

create table B as select A.c1, A.c2, A.c3, ... from A;

If table B already exists, you can just do:

insert into B select A.c1, A.c2, ... from A;

To speed this up, you would want to drop all the indexes on table B until the insert is done.
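Putting that together, here is a minimal sketch of the daily reload, assuming B already exists (column names are placeholders). The APPEND hint requests a direct-path insert, which writes above the high-water mark, bypasses the buffer cache, and generates less undo, making it usually the fastest single-statement way to bulk-copy rows; note that truncate is DDL and commits implicitly:

truncate table B;

insert /*+ append */ into B (c1, c2 /* ..., c20 */)
select A.c1, A.c2 /* ..., A.c20 */ from A;

commit;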

Brian Hoover
  • No, there is no sorting of data involved; I just need to copy the data to B. Daily, table 'B' will be truncated and new data will be added. So which will be the most efficient way: 1) INSERT INTO ... SELECT, or 2) BULK COLLECT? – cool_mo Jul 09 '13 at 16:27
  • I haven't ever done a performance test. Is this a one-time job? – Brian Hoover Jul 09 '13 at 19:37
  • I tried INSERT INTO, but what will I have to do if I need to commit after every 50,000 rows inserted? – cool_mo Jul 09 '13 at 19:46
  • No. Daily, table B will be truncated and new data will be inserted into it from A. – cool_mo Jul 09 '13 at 19:48
  • Then, you should probably do a test run. Shouldn't take too long to do. – Brian Hoover Jul 09 '13 at 19:49
  • Yes, I will test it, but I need to commit after every 50,000. Is that possible? – cool_mo Jul 09 '13 at 19:51
  • As far as I know, you would have to manually separate out the records in an INSERT INTO ... SELECT, so if this is a requirement you'll probably be better off with a stored procedure (see the sketch after these comments). If you are doing this daily, you might want to think about using a trigger on each insert/update to just do this after each new record. – Brian Hoover Jul 09 '13 at 19:58
  • @cool_mo - why do you (think you) need to commit every 50000 rows? – Alex Poole Jul 09 '13 at 22:06
  • @Alex Poole It's actually a business requirement. – cool_mo Jul 23 '13 at 09:29
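For the commit-every-50,000-rows requirement discussed in the comments above, a minimal PL/SQL sketch using BULK COLLECT with a LIMIT clause and a FORALL insert (column names are placeholders, and B's columns are assumed to match the cursor's select list). Note that intermediate commits usually make the job slower and leave it non-restartable if it fails partway, compared to a single INSERT ... SELECT:

declare
  cursor cur is
    select c1, c2 /* ..., c20 */ from A;
  type rows_t is table of cur%rowtype;
  l_rows rows_t;
begin
  open cur;
  loop
    -- fetch the next batch of up to 50,000 rows
    fetch cur bulk collect into l_rows limit 50000;
    exit when l_rows.count = 0;
    -- bulk-bind the whole batch into B in one round trip
    forall i in 1 .. l_rows.count
      insert into B values l_rows(i);
    commit;  -- per-batch commit, per the stated business requirement
  end loop;
  close cur;
end;
/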