
I want to insert 10 million rows into my Oracle database via a database link. What would be an optimized way to do that?

Would doing an INSERT ... SELECT * FROM [dblink_table_name] be an optimized way of doing it?
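
For reference, a minimal sketch of what a direct-path INSERT ... SELECT over a database link could look like; the table and link names (`local_big_table`, `remote_big_table`, `remote_db`) are placeholders, and the APPEND hint and parallel DML are assumptions about what "optimized" might mean here, not a confirmed recipe:

```sql
-- Placeholder names: local_big_table, remote_big_table and remote_db are hypothetical.
-- The APPEND hint requests a direct-path insert into the local table; parallel DML
-- is optional and only helps if the instance is configured for it.
ALTER SESSION ENABLE PARALLEL DML;

INSERT /*+ APPEND */ INTO local_big_table
SELECT * FROM remote_big_table@remote_db;

-- After a direct-path insert the table cannot be queried in the same transaction,
-- so commit (or roll back) straight away.
COMMIT;
```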

Imran Hemani
  • Try to avoid the need to write this much data through a db link. If the database you connect to is Oracle, why not use [SQL*Loader](https://docs.oracle.com/cd/B28359_01/server.111/b28319/ldr_concepts.htm#g1013706) instead, by connecting to the remote db? (See the sketch after these comments.) – Maheswaran Ravisankar Nov 23 '14 at 21:32
  • Why, out of interest, @Maheswaran? Whatever happens, you're going to have to do the disk I/O on both databases and use up the network bandwidth in order to transfer the data. I'm not sure exactly how a DB link works (assuming the Oracle Net stack, which means OCI), but as long as it pulls/pushes data in the same chunk size as, say, moving a file, then it'll be as efficient as using a file but quicker, since you perform your I/O on both databases simultaneously. – Ben Nov 23 '14 at 21:57
  • @Ben, well, just my usual practice; of course, an interest in the loader. In UNIX I would use a pipe to transfer larger files. Also, DIRECT path loading is an added advantage. And without digging into the technical details, I have always seen db links used for SELECT. Maybe I have made up my mind to use SQL*Loader. We would be using NAS drives at the OS level, so remote loading has always been a successful and DB-friendly activity, when I have at least 100+ parallel loads running. I also admit that with next-gen Exadata, whatever we do is faster, as you mentioned. – Maheswaran Ravisankar Nov 23 '14 at 22:42
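
A rough sketch of the SQL*Loader route suggested in the comments, assuming the data has already been extracted from the source database into a flat file; the file name, table name, columns, and credentials below are all made up for illustration:

```
-- load_big_table.ctl (hypothetical control file)
LOAD DATA
INFILE 'big_table_export.dat'
APPEND
INTO TABLE local_big_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id, name, created_date DATE "YYYY-MM-DD")
```

It would then be invoked with direct-path loading enabled, for example:

```
sqlldr userid=scott/tiger@target_db control=load_big_table.ctl direct=true
```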

0 Answers