
I am facing a challenge when I try to load more than 6 million records. I am getting the exception below after about 3 million records.

The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

The input source is a text file and the destination table is Oracle. Do I have to do any configuration?

[OLE_DST Oracle archive staging table [16]] Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available.  Source: "OraOLEDB"  Hresult: 0x80004005  Description: "ROW-00060: Internal error: [dainsert,16]".

[OLE_DST Oracle archive staging table [16]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "OLE_DST Oracle archive staging table.Inputs[OLE DB Destination Input]" failed because error code 0xC020907B occurred, and the error row disposition on "OLE_DST Oracle archive staging table.Inputs[OLE DB Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "OLE_DST Oracle archive staging table" (16) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (29). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

[FF_SRC Archive file data [2]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on FF_SRC Archive file data returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
user416
  • @billinkc, thanks for the reply. I have added the error message. – user416 Aug 15 '16 at 15:36
  • You have data values in your flat file that do not align with the allowable range in Oracle. Whether that means you're putting a NULL in a non-nullable column, you've got a number that exceeds the min/max value, or just a bad value, it's impossible to say. I'd take a [similar approach](https://social.msdn.microsoft.com/Forums/sqlserver/en-US/baddced5-75f4-49e8-8239-e3247caa61b1/row00060-internal-error-dainsert16?forum=sqlintegrationservices) and remove columns until you get a successful run, then slowly bring them back in (a rough pre-scan sketch follows this thread). – billinkc Aug 15 '16 at 15:45
  • Thanks. I am using a single column for the data transfer; I have disabled the remaining columns and I didn't see any bad data in the particular error record. I'm planning to break the single file down into many and then test it (see the splitting sketch after this thread). Any advice on this? My default buffer size is 3 times higher than the input record count. – user416 Aug 15 '16 at 16:40
  • Seems like a 'limitation' of the Oracle OLEDB connector - I faced a similar issue in the past and found a similar thread on the [MSDN Forum](https://social.msdn.microsoft.com/Forums/sqlserver/en-US/5851fbbe-9d31-4665-a403-01f018684839/data-flow-task-failed-after-loading-5245857-rows-oracle-db-source-and-destination?forum=sqlintegrationservices). Migrating to the Attunity drivers helped me too. – Ferdipux Aug 15 '16 at 16:49
  • An alternative to the Attunity connector: split the data into chunks and load it in several Data Flow tasks instead of one. – Ferdipux Aug 15 '16 at 16:50
  • I agree with @Ferdipux that you can split the data into chunks, but you might also be able to do it in the same Data Flow task by using a Conditional Split, splitting your one input source into multiple streams and then repeating your destination a few times. – Matt Aug 15 '16 at 18:09
  • @Ferdipux, I have installed the Attunity connector (Microsoft Connectors v4.0 for Oracle and Teradata) and tried to execute the container in VS2015 Professional edition, but I am getting the error below. Any help on this? Exception deserializing the package: "The package failed to load due to error 0xC0011008 "Error loading from XML. No further detailed error information can be specified for this problem because no Events object was passed where detailed error information can be stored.". This occurs when CPackage::LoadFromXML fails" – user416 Aug 30 '16 at 19:23
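
For reference, a rough pre-scan along the lines of billinkc's suggestion: walk the flat file and flag values that could not fit the destination column. This is a minimal sketch, not taken from the question: the file name `archive_data.txt`, the `|` delimiter, the column index, and the NUMBER(10) bound are all illustrative assumptions.

```python
# Rough pre-scan of the flat file for values that would violate an
# Oracle column definition (ROW-00060 frequently points at data the
# destination cannot store). Delimiter, column index, and the
# NUMBER(10) bound are illustrative assumptions, not from the question.
from decimal import Decimal, InvalidOperation

MAX_ABS = Decimal("9999999999")  # assumed NUMBER(10) upper bound

def find_bad_rows(path, col=0, sep="|"):
    """Return (line_number, raw_value) pairs that look unloadable."""
    bad = []
    with open(path, "r", encoding="utf-8") as src:
        for lineno, line in enumerate(src, start=1):
            fields = line.rstrip("\n").split(sep)
            if col >= len(fields) or fields[col] == "":
                bad.append((lineno, line.rstrip("\n")))   # missing/NULL value
                continue
            try:
                if abs(Decimal(fields[col])) > MAX_ABS:   # exceeds precision
                    bad.append((lineno, fields[col]))
            except InvalidOperation:                      # not numeric at all
                bad.append((lineno, fields[col]))
    return bad

if __name__ == "__main__":
    # "archive_data.txt" is a placeholder for the actual source file.
    for lineno, value in find_bad_rows("archive_data.txt")[:20]:
        print(lineno, repr(value))
```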
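Similarly, a minimal sketch of the file-splitting approach user416 and Ferdipux discuss, assuming a plain text file with one record per line; the input name and chunk size are again illustrative. Each chunk can then be loaded by its own Data Flow task.

```python
# Split one large flat file into fixed-size chunks so each chunk can
# be loaded by its own Data Flow task. The input name and CHUNK_ROWS
# are illustrative; adjust to the real file and volume.
CHUNK_ROWS = 1_000_000  # rows per output file (assumed)

def split_file(path, chunk_rows=CHUNK_ROWS):
    """Write path.part001, path.part002, ... and return the part count."""
    out, part = None, 0
    with open(path, "r", encoding="utf-8") as src:
        for i, line in enumerate(src):
            if i % chunk_rows == 0:          # start a new chunk file
                if out:
                    out.close()
                part += 1
                out = open(f"{path}.part{part:03d}", "w", encoding="utf-8")
            out.write(line)
    if out:
        out.close()
    return part

if __name__ == "__main__":
    print(split_file("archive_data.txt"), "chunk files written")
```

Loading the chunks through separate Data Flow tasks bounds how many rows each destination run pushes through OraOLEDB; Matt's Conditional Split (e.g. routing rows on a modulo of a counter column) achieves a similar split inside a single task.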

0 Answers