
A client uploads a text file that contains one alphanumeric ID per line (a single column). In my controller class I read each line from the text file and join the values into a comma-separated string (100X,101Y,102Z,103T,104G,105V,106C,107W,108Q).

I send that comma-separated string as a single parameter to an Oracle procedure, where I need to insert the values into a global temporary table (each value in a separate row). I am new to stored procedures, so I have no idea how to insert these values into the global temporary table. Alternatively, can I achieve this with SQL*Loader, and if so, how is it used? Can you please help/guide me with my issue? Please check the code below for more info.
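A simplified sketch of the controller step described above (the file name, class name, and fallback sample values are illustrative, not my real code):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class IdFileReader {

    // Join one-ID-per-line values into a single comma-separated string,
    // which is then passed to the Oracle procedure as one parameter.
    static String toCsv(List<String> lines) {
        return String.join(",", lines);
    }

    public static void main(String[] args) throws IOException {
        Path file = Path.of("ids.txt");  // example path; the real file is uploaded by the client
        List<String> ids = Files.exists(file)
                ? Files.readAllLines(file)
                : List.of("100X", "101Y", "102Z", "103T");  // fallback sample values
        System.out.println(toCsv(ids));
    }
}
```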

-- just for a test
create global temporary table my_temp_table (  -- global temporary table
   id varchar2(30)
)
on commit preserve rows;

-- this is my procedure
CREATE OR REPLACE PROCEDURE remove_emp (
  employee_id IN CLOB  -- contains comma-separated alphanumeric values ('100X,101Y,102Z,103T', ...)
) AS
BEGIN
  INSERT INTO my_temp_table (id)
  VALUES (employee_id);  -- not working for me, need help here
END;
/


rabbit

1 Answer


See if this helps.

The procedure splits the comma-separated string into rows and inserts them into the table.

SQL> create or replace procedure remove_emp (par_empid in clob) is
  2  begin
  3    insert into my_temp_table (id)
  4    select regexp_substr(par_empid, '[^,]+', 1, level)
  5    from dual
  6    connect by level <= regexp_count(par_empid, ',') + 1;
  7  end;
  8  /

Procedure created.

Testing:

SQL> begin
  2    remove_emp('100X,101Y,102Z,103T,104G,105V,106C,107W,108Q');
  3  end;
  4  /

PL/SQL procedure successfully completed.

SQL> select * From my_temp_table;

ID
------------------------------
100X
101Y
102Z
103T
104G
105V
106C
107W
108Q

9 rows selected.

SQL>
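If you keep the comma-separated approach, your controller can invoke the procedure through plain JDBC, roughly like this (a sketch: connection handling is left to the caller, and the small helper only predicts how many rows the procedure should insert):

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;

public class RemoveEmpCaller {

    // Invokes the remove_emp procedure with the comma-separated ID string.
    // The caller supplies an open Oracle Connection (driver setup not shown).
    static void callRemoveEmp(Connection conn, String csvIds) throws SQLException {
        try (CallableStatement cs = conn.prepareCall("{call remove_emp(?)}")) {
            cs.setString(1, csvIds);  // setString is sufficient for a CLOB IN parameter of this size
            cs.execute();
        }
    }

    // Number of rows the procedure should insert for a given input string.
    static int idCount(String csvIds) {
        return csvIds.isEmpty() ? 0 : csvIds.split(",", -1).length;
    }

    public static void main(String[] args) {
        String csv = "100X,101Y,102Z,103T,104G,105V,106C,107W,108Q";
        System.out.println(idCount(csv));  // 9, matching "9 rows selected." above
    }
}
```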

However, if I were you, I'd skip this approach entirely. As you already have a file containing rows of data, use

  • SQL*Loader or
  • external tables feature or
  • UTL_FILE package

to load such data, because what you do now is to

  • convert rows to a comma-separated values loooong string
  • pass it to a procedure
  • split that string back into rows
  • insert them into a table

A lot of work, most of it in vain.

As for the options I suggested: SQL*Loader allows the source file to be local (on your PC), while the other two options require the file to be located on the database server. Whichever option you choose, it will be faster than what you're doing now. Think about it.


SQL*Loader example:

The control file is simple; it presumes that the file is located in my c:\temp directory and is named data16.txt.

load data
infile 'c:\temp\data16.txt'
replace
into table my_temp_table
(
  id char(30)
)  

Table description:

SQL> desc my_temp_table;
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 ID                                                 VARCHAR2(30)

Loading session:

c:\Temp>sqlldr scott/tiger control=test16.ctl log=test16.log

SQL*Loader: Release 11.2.0.2.0 - Production on Pon Tra 6 12:44:34 2020

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Commit point reached - logical record count 8
Commit point reached - logical record count 9

Result:

c:\Temp>sqlplus scott/tiger

SQL*Plus: Release 11.2.0.2.0 Production on Pon Tra 6 12:44:42 2020

Copyright (c) 1982, 2014, Oracle.  All rights reserved.


Connected to:
Oracle Database 11g Express Edition Release 11.2.0.2.0 - 64bit Production

SQL> select * From my_temp_table;

ID
------------------------------
100X
101Y
102Z
103T
104G
105V
106C
107W
108Q

9 rows selected.

SQL>
Littlefoot