
I am trying to insert data from a text field which is varchar2(8000) on SQL Server. I have created a similar column, but of CLOB datatype, on the Oracle database. Can you please help me resolve the inconsistent-datatypes issue and improve the performance of the query? Please also let me know if there is a better way to insert huge text values for multiple rows. When I run the same query on SQL Server and on the Oracle database (i.e. select "Note" from TBL_NOTES@PPRLEG) the results are retrieved quickly, but when I try to insert the data it takes a long time and never finishes. I should only have to pass in the column that contains the CLOBs, and it should handle the rest.

CREATE OR REPLACE FUNCTION get_clob_data (
  id        IN TBL_NOTES.ID%TYPE,
  report_id IN TBL_NOTES.REPORT_ID%TYPE,
  unique_id IN TBL_NOTES.UNIQUE_ID%TYPE,
  user_id   IN TBL_NOTES.USER_ID%TYPE,
  dts       IN TBL_NOTES.USERROLE%TYPE,
  userein   IN TBL_NOTES.USERROLE%TYPE
)
  RETURN VARCHAR2
IS
  l_text  VARCHAR2(4000);
  TYPE t_note_txt IS TABLE OF TBL_NOTES.NOTE%TYPE;
  v_notes t_note_txt;
BEGIN
  -- Pull every NOTE row from the remote SQL Server table over the dblink
  SELECT LTRIM(RTRIM("NOTE")) NOTE
    BULK COLLECT INTO v_notes
    FROM TBL_NOTES@PPRLEG;

  -- Concatenate the notes into one string
  FOR indx IN 1 .. v_notes.COUNT LOOP
    l_text := l_text || ' ' || v_notes(indx);
  END LOOP;

  RETURN l_text;
END;
/

  • afaik SQL Server doesn't have a varchar2 data type. Why are you using a function here? How does this function even work? You are limiting output to 4000 characters but selecting the entire remote table. It would be easier to help if you would give more details: the definition of the remote table, the number of rows, and what you are trying to accomplish. – Robert Dupuy Sep 27 '15 at 09:11
  • We are trying to migrate SQL Server 2008 R2 data to an Oracle 11g database. We are trying to insert the rows (count 379012) using a dblink, but the fields with large text on SQL Server have a performance issue. We created CLOB datatypes for one such table (TBL_NOTES) and we are trying to perform a bulk collect operation here. The field "Note" on SQL Server is of text datatype and the field at the Oracle end is CLOB. I am very new to this migration and hence looking for a fix. Please note that we have very large text fields and multiple CLOB columns. – user3296391 Sep 27 '15 at 13:25
  • TBL_NOTES.NOTE is text datatype at SQL Server; TBL_NOTES.NOTE is CLOB datatype at Oracle. 379012 is the row count in SQL Server. Definition of the remote table: ID int, REPORT_ID int, UNIQUE_ID varchar, user_id int, NOTE text, DTS datetime, SalesNote text, userrole varchar, userein varchar – user3296391 Sep 27 '15 at 13:27
  • Have you tried [Migrate to Oracle Database with SQL Developer](http://www.oracle.com/technetwork/database/migration/index-084442.html)? – user272735 Sep 28 '15 at 08:33

2 Answers


Since no one else has answered the question yet, I'm going to start the conversation.

But keep in mind, I don't have a remote SQL Server with a text field to test on today.

What I note about your original code is that you go to the remote database inside a function, and that concerns me: retrieving the entire table repeatedly inside a function would be very costly in terms of time.

My preference is to take the data back from the remote SQL Server database in one call.

So, you could create a table on the Oracle side:

CREATE GLOBAL TEMPORARY TABLE NOTES_STG (
  ID        int,
  REPORT_ID int,
  USER_ID   int,
  NOTE      clob,
  DTS       date,
  SalesNote clob,
  userrole  varchar2(50),
  userin    varchar2(50)
);

and then:

INSERT INTO NOTES_STG SELECT ID, REPORT_ID, USER_ID, NOTE, DTS, SalesNote, USERROLE, USEREIN FROM TBL_NOTES@PPRLEG;

These are examples; you may need to tweak the definitions. I don't know the size of the userin varchar column, for example.

In this case, I'm just trying to move the conversation forward.

But what I would want to see is whether taking the whole table across with one select statement is fast enough, or whether it is still too slow.

Also note that I used a global temporary table in this example; obviously that would not work if it exhausts your PGA and TEMP tablespace, so depending on what you have available, you may choose not to use a GTT.
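
If you do stay with a GTT, one thing to watch is that CREATE GLOBAL TEMPORARY TABLE defaults to ON COMMIT DELETE ROWS, so the staged rows disappear at the first commit. A sketch of the session-scoped variant, keeping the same column list as above:

CREATE GLOBAL TEMPORARY TABLE NOTES_STG (
  ID        int,
  REPORT_ID int,
  USER_ID   int,
  NOTE      clob,
  DTS       date,
  SalesNote clob,
  userrole  varchar2(50),
  userin    varchar2(50)
) ON COMMIT PRESERVE ROWS;   -- rows stay until the session ends, not just until the first commit

With PRESERVE ROWS the staged data remains visible to your session after each commit, which is usually what you want when you load the staging table and then copy it onward in separate statements.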

I have, in the past, not been happy with the performance of an ODBC connection to SQL Server and done more sophisticated workarounds, but I can't say I have any specific workaround that I use as a general rule.

Robert Dupuy
  • Thank you for your reply. I created the global temporary table and when I ran the insert statement, it throws this error: SQL Error: ORA-00997: illegal use of LONG datatype. Meanwhile, I'll check the performance of the query. – user3296391 Sep 28 '15 at 16:23

Thank you for your reply. I created the global temporary table and when I ran the insert statement, it throws the error below:

SQL Error: ORA-00997: illegal use of LONG datatype

Meanwhile, I'll check the performance of the query.
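
ORA-00997 here most likely means the gateway is presenting the SQL Server text column as LONG, and the implicit LONG-to-CLOB conversion is not done inside a single INSERT ... SELECT (TO_LOB() is the documented conversion for local LONG columns, but it may not help over a heterogeneous link). One workaround worth trying, purely as a sketch, is to fetch the rows in PL/SQL and insert them one at a time, so the note only ever passes through a PL/SQL variable. It assumes every note fits in 32K, that a plain SELECT over the link works as described in the question, that NOTES_STG has the columns listed, and that the quoted names match the SQL Server column spellings:

DECLARE
  -- Remote SQL Server columns, aliased so the loop record fields are easy
  -- to reference; the quoted spellings are assumptions about the remote names.
  CURSOR c_notes IS
    SELECT "ID"        AS id,
           "REPORT_ID" AS report_id,
           "UNIQUE_ID" AS unique_id,
           "NOTE"      AS note,      -- arrives as LONG through the gateway
           "DTS"       AS dts,
           "UserEIN"   AS userein
      FROM TBL_NOTES@PPRLEG;
BEGIN
  FOR r IN c_notes LOOP
    -- The note is now an ordinary PL/SQL value, so inserting it into the
    -- CLOB column is a plain implicit conversion.
    INSERT INTO NOTES_STG (id, report_id, unique_id, note, dts, userin)
    VALUES (r.id, r.report_id, r.unique_id, r.note, r.dts, r.userein);
  END LOOP;
  COMMIT;
END;
/

Row-by-row processing is slower than one set-based insert, but it sidesteps the LONG restriction; if it works, batching the fetch with BULK COLLECT ... LIMIT would be the next step.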

user3929655
  • insert into NOTES_STG (id, note, dts, report_id, unique_id, userin) select "ID", ("NOTE"), "DTS", "REPORT_ID", "UNIQUE_ID", "UserEIN" from tbl_notes@pprleg; This is the insert statement. Can we load the data using SQL*Loader? If so, how can we do that? – user3296391 Sep 29 '15 at 03:14
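
On the SQL*Loader question above: SQL*Loader can load CLOB columns, but the rows would first have to be exported from SQL Server to a flat file (with bcp, for example), and notes containing the delimiter or line breaks make a simple delimited file fragile. A minimal control-file sketch, in which notes.dat, the pipe delimiter, the field order, and the date mask are all assumptions:

-- notes.ctl (sketch): field order must match the exported file
LOAD DATA
INFILE 'notes.dat'
APPEND
INTO TABLE NOTES_STG
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  id,
  report_id,
  unique_id,
  dts       DATE "YYYY-MM-DD HH24:MI:SS",
  userin,
  note      CHAR(100000)   -- raise the default 255-byte limit so the text fits the CLOB column
)

Run it with sqlldr userid=... control=notes.ctl. For text that can contain arbitrary characters, SQL*Loader's LOBFILE clause (one secondary file per row) is the safer pattern.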