
I'm using SQL*Loader to load my data into the database.

Before I insert the data I need to remove the existing data in the table:

options(skip=1,load=250000,errors=0,ROWS=30000,BINDSIZE=10485760)
load data
infile 'G:1.csv' "str '^_^'"
replace
into table IMPORT_ABC
fields terminated by "," OPTIONALLY ENCLOSED BY '"'
trailing nullcols(
.
.
.
.)

But I got error like:

SQL*LOADER-926: OCI error while executing delete/truncate for table IMPORT_ABC
ORA-30036: unable to extend segment by 8 in undo tablespace 'undo1'

How can I delete the data in batches, for example 10000 rows at a time? I know that I have some limit on my DB.

4est
    If it doesn't need to be recoverable - presumably not if you're replacing anyway - is there a reason you're using [`replace`](http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#sthref660) rather than [`truncate`](http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#sthref669)? – Alex Poole Apr 08 '13 at 12:11
  • no, I just want to delete all the items from the table and then insert new ones – 4est Apr 08 '13 at 12:56

2 Answers


Deleting records in batches can be done in a PL/SQL loop, but it's generally considered bad practice: the entire delete should normally be a single transaction, and that can't be done from within the SQL*Loader control file. Your DBA should size the UNDO space to accommodate the work you need to do.
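If batching the delete really is unavoidable (for example, you only have `DELETE` privileges and the undo tablespace can't be resized), a minimal PL/SQL sketch might look like the following; the table name is from the question, and the batch size of 10000 is an arbitrary choice you'd tune to your undo limit:

```sql
-- Sketch only: deletes in batches of 10,000 rows, committing after each
-- batch so each transaction's undo stays small. Run before the load.
BEGIN
  LOOP
    DELETE FROM import_abc WHERE ROWNUM <= 10000;
    EXIT WHEN SQL%ROWCOUNT = 0;  -- stop once the table is empty
    COMMIT;                      -- release undo for this batch
  END LOOP;
  COMMIT;
END;
/
```

Note the usual caveat: committing inside a loop means a failure part-way through leaves the table partially deleted, which is one reason a single transaction (or a truncate) is preferred.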

If you're deleting the entire table you'll almost certainly be better off truncating anyway, either in the control file:

options(skip=1,load=250000,errors=0,ROWS=30000,BINDSIZE=10485760)
load data
infile 'G:1.csv' "str '^_^'"
truncate
into table IMPORT_ABC
...

Or as a separate truncate statement in SQL*Plus/SQL Developer/some other client before you start the load:

truncate table import_abc;

The disadvantage is that your table will appear empty to other users while the new rows are being loaded, but if it's a dedicated import area (guessing from the name) that may not matter anyway.

If your UNDO is really that small then you may have to run multiple loads, in which case - probably obviously - you need to make sure you only have the truncate in the control file for the first one (or use the separate truncate statement), and have append instead in subsequent control files as you noted in comments.

You might also want to consider external tables if you're using this data as a base to populate something else, as there is no UNDO overhead on replacing the external data source. You'll probably need to talk to your DBA about setting that up and giving you the necessary directory permissions.
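A hypothetical external-table definition might look like this; the directory name, column list, and file location are all assumptions for illustration, not taken from the question:

```sql
-- Sketch only: directory, columns and file name are illustrative.
-- The directory object normally has to be created by the DBA.
CREATE DIRECTORY import_dir AS 'G:\';

CREATE TABLE import_abc_ext (
  col1 VARCHAR2(100),
  col2 NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY import_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('1.csv')
)
REJECT LIMIT UNLIMITED;
```

Replacing the CSV file on disk then replaces the table's contents with no delete, and hence no undo, at all.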

Alex Poole
  • can I do it like this: first sql*loader file: `options(skip=1,load=250000,errors=0,ROWS=30000,BINDSIZE=10485760) load data infile 'G:\Remediation metrics\Source data\Data Backup & Recovery\ALL_admin_combined_report_v1.csv' "str '^_^'" truncate into table IMPORT_ABC.....` 2nd sql*loader file: `options(skip=250001,load=250000,errors=0,ROWS=30000,BINDSIZE=10485760) load data infile '2.csv' "str '^_^'" append into table ..` 3rd sql*loader: `options(skip=500001,load=250000,errors=0,ROWS=30000,BINDSIZE=10485760) load data infile '1.csv' "str '^_^'" append into table` – 4est Apr 08 '13 at 14:11
  • @user2207426 - yes, that's what I meant. Added a reference to the answer; thanks. – Alex Poole Apr 08 '13 at 14:27
  • @AlexPoole: Thanks! It would work, but my user will not be granted truncate permissions; I only have delete privileges. For now I use a separate SQL*Plus statement to delete the data and then sqlldr as above... But I'm not amused... – Sauer Nov 03 '17 at 13:40

Your undo tablespace is too small to hold all the undo information, and it seems it cannot be extended.

You can split the import into smaller batches and issue a commit after each batch, or get your DBA to increase the tablespace for undo1.

And use truncate instead of replace before you start the imports.

Sibster
    The `replace` will attempt to delete everything in the table in the first run, however many records are being inserted; and subsequent runs with the same parameter would delete the rows you just inserted. Batching might be needed for the insert phase though... – Alex Poole Apr 08 '13 at 12:15
  • yes, I have data import split into 3 config files and it's ok but I can't remove all the items (in DB) by once, somehow I need to delete them by 1000rows etc how can I do it ? – 4est Apr 08 '13 at 12:36
  • Did you try truncate? Or you could try a delete based on rowid: http://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:2345591157689 – Sibster Apr 08 '13 at 13:12