
I'm loading a bunch of data from a file to a table in a database with SQL*Loader. My control file is this:

OPTIONS (DIRECT=FALSE)
LOAD DATA
INFILE *
BADFILE *                                                          
APPEND                                         
INTO TABLE TDE_MIDINERO            
(                                               
    ACTIVA  POSITION(1:20) CHAR(20),         
    TIPUSACT POSITION(21:21) CHAR(1),        
    IMPREINT POSITION(22:32) DECIMAL EXTERNAL(11) ":IMPREINT/100",
    NUMREINT POSITION(33:37) INTEGER EXTERNAL,
    REINTEGROS_MES1 POSITION(38:48) DECIMAL EXTERNAL(11) ":REINTEGROS_MES1/100",
    REINTEGROS_MES2 POSITION(49:59) DECIMAL EXTERNAL(11) ":REINTEGROS_MES2/100",
    REINTEGROS_MES3 POSITION(60:70) DECIMAL EXTERNAL(11) ":REINTEGROS_MES3/100",
    REINTEGROS_MES4 POSITION(71:81) DECIMAL EXTERNAL(11) ":REINTEGROS_MES4/100",
    REINTEGROS_MES5 POSITION(82:92) DECIMAL EXTERNAL(11) ":REINTEGROS_MES5/100",
    REINTEGROS_MES6 POSITION(93:103) DECIMAL EXTERNAL(11) ":REINTEGROS_MES6/100",
    REINTEGROS_MES7 POSITION(104:114) DECIMAL EXTERNAL(11) ":REINTEGROS_MES7/100",
    REINTEGROS_MES8 POSITION(115:125) DECIMAL EXTERNAL(11) ":REINTEGROS_MES8/100",
    REINTEGROS_MES9 POSITION(126:136) DECIMAL EXTERNAL(11) ":REINTEGROS_MES9/100",
    REINTEGROS_MES10 POSITION(137:147) DECIMAL EXTERNAL(11) ":REINTEGROS_MES10/100",
    REINTEGROS_MES11 POSITION(148:158) DECIMAL EXTERNAL(11) ":REINTEGROS_MES11/100",
    REINTEGROS_MES12 POSITION(159:169) DECIMAL EXTERNAL(11) ":REINTEGROS_MES12/100",
    ACUMULADO POSITION(170:182) DECIMAL EXTERNAL(13) ":ACUMULADO/100",
    IMPMAX POSITION(183:195) DECIMAL EXTERNAL(13) ":IMPMAX/100",
    IMPLIMIT POSITION(196:208) DECIMAL EXTERNAL(13) ":IMPLIMIT/100",
    DATULTM sysdate,
    CAJERO POSITION(209:213) INTEGER EXTERNAL,
    DATUDIA POSITION(214:222) INTEGER EXTERNAL,
    DATUMES POSITION(223:231) INTEGER EXTERNAL
) 

The field ACTIVA is the primary key. My problem is that SQL*Loader is not preventing duplicate records with the same primary key value.

How can I avoid this? My table has lots of duplicate records and its index is corrupted. Is there some way to discard them (or put them in a bad file) before inserting them into the table?

Thank you in advance

eloweyn

1 Answer


If you create the primary key constraint on the table you're loading into before running the load, every duplicate record after the first one for a given key value will be rejected and written to the bad file. Just set the ERRORS parameter to a high value so that the load doesn't stop at the default limit of 50 errors.
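For example, the OPTIONS line at the top of your control file could be extended like this (the ceiling of 100000 is illustrative; pick a value comfortably above the number of duplicates you expect):

```
OPTIONS (DIRECT=FALSE, ERRORS=100000)
```

The rejected duplicate rows then end up in the bad file named by BADFILE, so you can inspect them after the load.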

Of course, this is a crude way of handling duplicates: alternatively, you can load all the records into a staging table without the constraint and deduplicate them with SQL before inserting into the real table.
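A sketch of that staging approach, assuming a hypothetical table TDE_MIDINERO_STG with the same columns as TDE_MIDINERO but no primary key, loaded with the same control file:

```sql
-- Keep only one row per ACTIVA in the staging table.
-- Which duplicate survives is arbitrary here (the one with the
-- smallest ROWID); replace MIN(...) with a business rule if one
-- duplicate should win over another.
DELETE FROM TDE_MIDINERO_STG s
WHERE s.ROWID <> (SELECT MIN(t.ROWID)
                  FROM TDE_MIDINERO_STG t
                  WHERE t.ACTIVA = s.ACTIVA);

-- Then move the deduplicated rows into the real table.
INSERT INTO TDE_MIDINERO
SELECT * FROM TDE_MIDINERO_STG;

COMMIT;
```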

Nicolas