
I have almost exactly the same issue as the scenario linked below, but unfortunately I'm unable to recreate the solutions successfully.

I have a C# application using SQL bulk copy (SqlBulkCopy.WriteToServer) with a data reader, where the reader is either a SqlDataReader or an OracleDataReader, and I need to add columns to the result set.

I cannot do it in the source SQL statement, and I cannot load a DataTable first and modify it (it's hundreds of GB of data, almost a terabyte). The basic streaming setup I have today looks like the sketch below.
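A minimal sketch of that setup, assuming the Oracle managed provider (Oracle.ManagedDataAccess.Client); the table names, column names, and connection strings are placeholders, not my real schema:

```csharp
using System.Data.SqlClient;
using Oracle.ManagedDataAccess.Client;

class BulkCopyExample
{
    static void Copy(string oracleConnStr, string sqlConnStr)
    {
        using (var src = new OracleConnection(oracleConnStr))
        using (var cmd = new OracleCommand("SELECT ID, NAME FROM SOURCE_TABLE", src))
        {
            src.Open();
            using (var reader = cmd.ExecuteReader())      // OracleDataReader
            using (var bulk = new SqlBulkCopy(sqlConnStr))
            {
                bulk.DestinationTableName = "dbo.TargetTable";
                bulk.BulkCopyTimeout = 0;                 // no timeout for very large loads
                bulk.BatchSize = 10000;
                bulk.WriteToServer(reader);               // streams rows, no DataTable in memory
            }
        }
    }
}
```

This streams rows straight from the reader into SQL Server, which is why it copes with the data volume, but it gives me nowhere to add or reshape columns along the way.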

How to add columns to DataReader

Can anyone provide a working example and help "push" me over this problem?

I temporarily found a solution using SQL Server Integration Services (SSIS), but what I found while watching it run is that it downloads all the data into a DTS buffer, then does the column modifications, and then pumps the data into SQL Server. Try doing that with a couple hundred GB of data and performance is poor, if you can even get your infrastructure team to build you a 24-core VM with 128 GB of memory.

Robert

1 Answer


I finally have a small working example; the CodeProject article (jdweng) was helpful. I will post a follow-up. I've tested with SQL Server (SqlDataReader); I still need to test with the Oracle data reader.

One of the cases I was trying was converting an Oracle unique id (stored as a string) to a SQL Server uniqueidentifier. I want to convert it on the fly; there is no way to adjust the source Oracle statement (ADyson) to return a datatype compatible with SQL Server. Altering a 1 TB table afterwards from varchar(40) to uniqueidentifier is painful, but if I could just change it as part of the bulk insert, it would be quick.

And I think now I will be able to, using a wrapper reader along the lines of the sketch below.
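This is a minimal sketch of the idea rather than my exact code: a class that implements IDataReader, delegates almost everything to the wrapped SqlDataReader/OracleDataReader, and only intercepts the one varchar(40) column so SqlBulkCopy receives a Guid. The class name GuidConvertingReader and the column name ORACLE_GUID are placeholders, and it assumes the stored string is in a format Guid.Parse understands:

```csharp
using System;
using System.Data;

public class GuidConvertingReader : IDataReader
{
    private readonly IDataReader _inner;   // SqlDataReader or OracleDataReader
    private readonly int _guidOrdinal;     // ordinal of the varchar(40) column to convert

    public GuidConvertingReader(IDataReader inner, string guidColumnName)
    {
        _inner = inner;
        _guidOrdinal = inner.GetOrdinal(guidColumnName);
    }

    // The only place a value is changed: parse the string into a Guid on the fly.
    public object GetValue(int i)
    {
        if (i == _guidOrdinal && !_inner.IsDBNull(i))
            return Guid.Parse(_inner.GetString(i));
        return _inner.GetValue(i);
    }

    public int GetValues(object[] values)
    {
        int n = _inner.GetValues(values);
        if (_guidOrdinal < n && values[_guidOrdinal] is string s)
            values[_guidOrdinal] = Guid.Parse(s);
        return n;
    }

    // Report the converted column as a Guid so the bulk copy sees the right type.
    public Type GetFieldType(int i) => i == _guidOrdinal ? typeof(Guid) : _inner.GetFieldType(i);
    public Guid GetGuid(int i) => i == _guidOrdinal ? (Guid)GetValue(i) : _inner.GetGuid(i);

    // Everything else just delegates to the wrapped reader.
    public bool Read() => _inner.Read();
    public int FieldCount => _inner.FieldCount;
    public string GetName(int i) => _inner.GetName(i);
    public int GetOrdinal(string name) => _inner.GetOrdinal(name);
    public bool IsDBNull(int i) => _inner.IsDBNull(i);
    public string GetDataTypeName(int i) => _inner.GetDataTypeName(i);
    public object this[int i] => GetValue(i);
    public object this[string name] => GetValue(GetOrdinal(name));
    public bool GetBoolean(int i) => _inner.GetBoolean(i);
    public byte GetByte(int i) => _inner.GetByte(i);
    public long GetBytes(int i, long o, byte[] b, int off, int len) => _inner.GetBytes(i, o, b, off, len);
    public char GetChar(int i) => _inner.GetChar(i);
    public long GetChars(int i, long o, char[] b, int off, int len) => _inner.GetChars(i, o, b, off, len);
    public IDataReader GetData(int i) => _inner.GetData(i);
    public DateTime GetDateTime(int i) => _inner.GetDateTime(i);
    public decimal GetDecimal(int i) => _inner.GetDecimal(i);
    public double GetDouble(int i) => _inner.GetDouble(i);
    public float GetFloat(int i) => _inner.GetFloat(i);
    public short GetInt16(int i) => _inner.GetInt16(i);
    public int GetInt32(int i) => _inner.GetInt32(i);
    public long GetInt64(int i) => _inner.GetInt64(i);
    public string GetString(int i) => _inner.GetString(i);
    public DataTable GetSchemaTable() => _inner.GetSchemaTable();
    public bool NextResult() => _inner.NextResult();
    public int Depth => _inner.Depth;
    public bool IsClosed => _inner.IsClosed;
    public int RecordsAffected => _inner.RecordsAffected;
    public void Close() => _inner.Close();
    public void Dispose() => _inner.Dispose();
}
```

The wrapper slots into the existing bulk copy call without changing anything else, so the load still streams row by row:

```csharp
using (var reader = cmd.ExecuteReader())
using (var converting = new GuidConvertingReader(reader, "ORACLE_GUID"))
using (var bulk = new SqlBulkCopy(sqlConnStr))
{
    bulk.DestinationTableName = "dbo.TargetTable";
    bulk.WriteToServer(converting);   // SqlBulkCopy now receives Guid values for that column
}
```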

Robert