
My application, Fusion BICC, dumps data into Oracle Cloud Object Storage as CSV files. I need to load this data into my target database. Currently I load the files into an external table, compare it with the target table using MINUS, insert rows that are new, and update rows that already exist. I need a few suggestions:

1) What is the best way to compare records when there is a huge volume of data?
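For large volumes, a single MERGE statement usually beats a MINUS comparison followed by separate inserts and updates, because Oracle can resolve both branches in one pass over the data. A minimal sketch, assuming a hypothetical external table `EXT_EMPLOYEES` and a target table `EMPLOYEES` keyed on `EMPLOYEE_ID` (all names are placeholders, not from your schema):

```sql
-- Upsert in one pass: update matching rows, insert new ones.
MERGE INTO employees t
USING ext_employees s
   ON (t.employee_id = s.employee_id)
WHEN MATCHED THEN
  UPDATE SET t.first_name = s.first_name,
             t.last_name  = s.last_name,
             t.salary     = s.salary
WHEN NOT MATCHED THEN
  INSERT (employee_id, first_name, last_name, salary)
  VALUES (s.employee_id, s.first_name, s.last_name, s.salary);
```

Note that the UPDATE branch rewrites matching rows even when nothing changed; if most rows are unchanged, you can add a `WHERE` clause on the MATCHED branch comparing the columns to skip no-op updates.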

2) Instead of reading through an external table, is there any better way (SQL*Loader, UTL_FILE, etc.)?
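If you stay with the external-table approach, DBMS_CLOUD can create the external table directly over the files in Object Storage, so there is no need for SQL*Loader or UTL_FILE at all. A sketch, assuming a credential `OBJ_STORE_CRED` has already been created with `DBMS_CLOUD.CREATE_CREDENTIAL`; the table name, columns, and bucket URI are hypothetical placeholders:

```sql
BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'EXT_EMPLOYEES',
    credential_name => 'OBJ_STORE_CRED',
    -- Wildcards let one external table cover every matching extract file.
    file_uri_list   => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/employees*.csv',
    column_list     => 'employee_id NUMBER, first_name VARCHAR2(50), last_name VARCHAR2(50), salary NUMBER',
    format          => JSON_OBJECT('type' VALUE 'csv', 'skipheaders' VALUE '1')
  );
END;
/
```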

3) If a record gets deleted in BICC, it does not appear in the CSV file, but I still have to delete such records from the target when they are not in the file. How do I tackle that?
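If each extract is a *full* snapshot of the source (rather than an incremental delta), an anti-join against the external table handles deletes; using the same hypothetical names as above:

```sql
-- Remove target rows whose key no longer appears in the latest extract.
-- Only valid when the CSV files represent a complete snapshot.
DELETE FROM employees t
 WHERE NOT EXISTS (SELECT 1
                     FROM ext_employees s
                    WHERE s.employee_id = t.employee_id);
```

With incremental extracts this would wrongly delete unchanged rows, so you would instead need BICC to emit a deleted-keys file (or a soft-delete flag) and drive the DELETE from that.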

Other than DBMS_CLOUD, is there any other package to upload data? I am very new to this, so please advise.

Note that BICC is an application that dumps data as CSV files to Oracle Cloud Object Storage; I am basically interested in reading data from cloud storage into DBaaS.
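As an alternative to querying an external table, DBMS_CLOUD can also copy the files straight into a regular (staging) table in one call. A sketch with the same hypothetical credential and placeholder URI as before:

```sql
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'EMPLOYEES_STG',      -- must already exist
    credential_name => 'OBJ_STORE_CRED',
    file_uri_list   => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/employees*.csv',
    format          => JSON_OBJECT('type' VALUE 'csv', 'skipheaders' VALUE '1')
  );
END;
/
```

Rows that fail to load are recorded in a `COPY$...` log/bad-file table, which makes it easier to audit rejected records than with a plain external-table SELECT.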

  • External tables are fast and handy; if you can manage to read them from OSS, I see no problem with your strategy. DBMS_CLOUD was made to do just this... although you can also use SQL Developer to read from OSS and import into a new or existing table in your cloud DB. – thatjeffsmith Apr 11 '20 at 19:53
  • Thanks very much @thatjeffsmith – Apr 11 '20 at 20:29

0 Answers