3

On an ad hoc basis, we want to copy the contents of 4 of our Oracle production tables to the QA/UAT environments. This is not a direct copy; we need to copy data based on some input criteria for filtering.

Earlier we were using a Sybase database, so the BCP utility worked like a charm there. However, we have recently migrated to Oracle and have a similar data-copy requirement.

Based on my analysis so far, I have considered the options below:

  • RMAN (Recovery Manager) – Cannot use it, as it does not allow us to copy selected tables or filter the data.
  • SQL*Loader (sqlldr) – Cannot use this, as we have BLOB columns and I am not sure how to create a CSV file for those BLOBs. Any suggestions?
  • Oracle Data Pump (expdp/impdp) – Cannot use this directly: even though it allows copying selected tables, it does not let us filter data with a query that uses joins (I know it has a QUERY clause, but that works only on a single table). A workaround is to create temp tables with the desired dataset and dump them using expdp/impdp. Any suggestions if I have missed anything in this approach?
  • Database Link – This seems the most promising approach for this use case, but I need to check whether the DBA will allow us to create links to/from the PRD database.
  • SQL*Plus COPY – Cannot use this, as it does not work with BLOB fields.
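The temp-table workaround mentioned under the Data Pump bullet could be sketched like this (all object names, the directory, and the filter condition are assumptions for illustration, not your actual schema):

```sql
-- Sketch: stage the filtered, joined dataset in a regular table
-- (table names, join, and filter are made up for illustration).
create table tmp_orders_export as
select o.*
from   orders o
join   customers c on c.id = o.customer_id
where  c.region = 'EMEA';

-- Then, at the OS prompt on source and target respectively
-- (DP_DIR is an assumed directory object):
--   expdp user/pass directory=DP_DIR tables=TMP_ORDERS_EXPORT dumpfile=tmp_orders.dmp
--   impdp user/pass directory=DP_DIR dumpfile=tmp_orders.dmp remap_table=TMP_ORDERS_EXPORT:ORDERS_COPY
```

The REMAP_TABLE option on impdp (available from 11g onwards) lets the staged data land in a differently named table on the target, so the temp table never has to exist there.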

Can someone please advise on the best approach with respect to performance?

Swapnil Jaju
  • 432
  • 1
  • 6
  • 14
  • Hi folks, the reason I raised this question is the twist in my requirement: having to copy BLOBs and also do some filtering before copying the data from source to target. Since all the above-mentioned approaches have pros and cons, I want to find out whether anyone has faced such a situation before and has a solution. Kindly re-open the question or let me know of any concerns – Swapnil Jaju Feb 09 '17 at 03:03

2 Answers

2

I would probably use a DATAPUMP-format external table. So it would be something like:

create table my_ext_tab
organization external
(
        type oracle_datapump
        default directory UNLOAD
        location( 'my_ext_tab.dmp' )
)
as
<my query>;

You can then copy the file across to your other database, create the external table, and then insert into your new table via an insert, something like:

insert /*+ APPEND */ into my_tab
select * from my_ext_tab
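On the target side, the external table is recreated with an explicit column list pointing at the copied dump file. A sketch, assuming a two-column result from the unload query (column names and types are illustrative and must match the projection of the original query):

```sql
-- Assumed target-side definition; the column list must match
-- the columns produced by the unload query on the source.
create table my_ext_tab
(
        id       number,
        payload  blob
)
organization external
(
        type oracle_datapump
        default directory UNLOAD
        location( 'my_ext_tab.dmp' )
);
```

Note that the ORACLE_DATAPUMP access driver supports LOB columns, which is what makes this approach viable for the BLOB requirement.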

You can also use parallelism to read and write the files.
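A sketch of the parallel unload, assuming a degree of 2 (the degree, file names, and source table are made up; one dump file per parallel slave lets the writes proceed concurrently):

```sql
create table my_ext_tab
organization external
(
        type oracle_datapump
        default directory UNLOAD
        location( 'my_ext_tab_1.dmp', 'my_ext_tab_2.dmp' )
)
parallel 2
as
select /*+ parallel(t 2) */ * from source_table t;
```

The same PARALLEL degree can then be used on the target-side read and the final INSERT /*+ APPEND */.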

BobC
  • 4,208
  • 1
  • 12
  • 15
  • I think with the above approach, the external table on the target db will have to act as a staging table. I will then have to move the data from the staging table to the final tables so that my web application can query those final tables. I doubt the application can query the external tables directly. What do you think? – Swapnil Jaju Feb 09 '17 at 03:07
  • @SwapnilJaju. Technically you can run queries against an external table; in fact, that is really all we are doing when we load the final table; we are just doing an INSERT ... AS SELECT. However, most applications I have worked with do just that: read from the external table into the final table, which is then used by your application. – BobC Feb 09 '17 at 07:38
1

Taking all your constraints into account, it looks like a database link is the best option. You can create views for your queries, with the joins and filters, in the PROD environment and select from those views through the db link. That way, the filtering is done before the transfer over the network rather than after, on the target side.
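A sketch of that idea, with the view name, link name, and tables made up for illustration:

```sql
-- On PROD: a view that applies the joins and filters up front
create or replace view filtered_orders_v as
select o.id, o.order_date, c.name
from   orders o
join   customers c on c.id = o.customer_id
where  o.order_date >= date '2017-01-01';

-- On QA/UAT: only the pre-filtered rows cross the network
insert /*+ APPEND */ into orders_copy
select * from filtered_orders_v@prd_link;
```

One caveat for this particular question: as discussed in the comments below, LOB columns cannot be selected across a database link, so this works for the scalar columns only.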

JeromeFr
  • 1,869
  • 2
  • 17
  • 20
  • The issue with dblinks is that they serialize. – BobC Feb 08 '17 at 10:39
  • Yes indeed. It might not be a problem if it's an on-demand batch transfer. But your answer is better performance-wise; I didn't think of that solution. – JeromeFr Feb 08 '17 at 10:45
  • @BobC: Do you have any idea whether BLOBs can be sent over to the target DB via DB links? I read somewhere that it cannot be done. Kindly share any suggestions around that too. Thanks – Swapnil Jaju Feb 09 '17 at 02:58
  • @SwapnilJaju. It's pretty easy to test and see that indeed you cannot: SQL> create table junk3( id number, col2 blob ); Table created. SQL> select * from junk3@mylink; ERROR: ORA-22992: cannot use LOB locators selected from remote tables – BobC Feb 09 '17 at 07:41
  • @BobC: Any workaround that can be used to get it done? I read somewhere that we need to use a temporary table for this, but I have no idea how that can be implemented – Swapnil Jaju Feb 09 '17 at 09:42
  • @SwapnilJaju. Not that I can think of. But why not use the Datapump external table approach outlined in my answer? – BobC Feb 09 '17 at 09:51