
I have a custom source data flow component whose output will differ on every run, and I need to insert those records into a destination table.

Problem: I can't specify the input columns at design time for the destination component, because on each execution of the data flow task the source component returns different output columns, depending on the table schema.

Solution needed: How can a destination data flow component accept whatever input columns are available, without any mapping (either with an existing component or a custom component)?

manas das
  • This sounds more like a procedural code developer's question than an ETL developer's question. Are you sure that an ETL tool is the proper solution for your current scenario? – Raj More May 17 '16 at 13:58
  • Well, I got this done by mapping each table individually; I just wanted to know whether there is any way to do it dynamically. I started SSIS months back and still have a lot to learn. – manas das May 22 '16 at 12:51

3 Answers


The data flow's fixed structure is there for data validation and optimization purposes. All of its components will have fixed input and output columns. I would suggest the following possibilities:

  • Write a data flow for every possible schema. There are probably a finite number of possibilities. You could reduce the effort of this task by using Biml, which could generate the package structure for you. This may also open up the possibility of parallel loading.
  • Use a script task instead of a data flow. In the script task, write the rows for each input into a table.
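In an SSIS Script Task this would be C# (typically `SqlBulkCopy` against the destination), but the core idea of the second option — discovering the column list at runtime instead of mapping it at design time — can be sketched outside SSIS. A minimal Python sketch using SQLite, with `load_rows` and the `dest` table as hypothetical names:

```python
import sqlite3

def load_rows(conn, table, rows):
    """Insert a list of dict rows into `table`, building the column
    list at runtime instead of fixing it at design time."""
    if not rows:
        return 0
    cols = list(rows[0])                       # columns discovered from the data
    placeholders = ", ".join("?" for _ in cols)
    sql = f"INSERT INTO {table} ({', '.join(cols)}) VALUES ({placeholders})"
    conn.executemany(sql, [tuple(r[c] for c in cols) for r in rows])
    return len(rows)

# Demo: the destination copes with whatever columns the source emits.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest (id INTEGER, name TEXT)")
inserted = load_rows(conn, "dest", [{"id": 1, "name": "a"},
                                    {"id": 2, "name": "b"}])
```

The trade-off the comments mention is real: generating statements per batch like this is slower than a pre-validated data flow path, which is why performance suffered with heavy volumes.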


Mark Wojciechowicz
  • Yeah, sounds like I asked a dumb question. Initially I thought of using a Script Task and bulk copy, but the issue was performance; I had to compromise on that due to the heavy volume of data. I was already done with the individual mapping, and just wanted to know whether any kind of dynamic mapping would be possible. Thanks. – manas das May 22 '16 at 12:52

If you need to pass multiple inputs to a single script component, the only way I know to do this is by passing the multiple inputs to a UNION ALL component, and then passing the single output from the UNION ALL to the Script.

You'll have to account for any differences between the two column structures in the UNION ALL, and you may want to use a Derived Column if you need an easy way to identify which original input a row came from.
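The combination of UNION ALL plus a Derived Column tag can be sketched as follows (a plain Python analogue; the function and column names, such as `SourceInput`, are hypothetical, not SSIS API):

```python
def union_all_with_origin(inputs):
    """Concatenate rows from several named inputs (like UNION ALL) and
    add a derived column recording which input each row came from."""
    out = []
    for name, rows in inputs.items():
        for row in rows:
            tagged = dict(row)
            tagged["SourceInput"] = name  # the derived column
            out.append(tagged)
    return out

rows = union_all_with_origin({
    "OrdersSource":  [{"id": 1}],
    "ArchiveSource": [{"id": 2}, {"id": 3}],
})
```

Downstream, a single script can then branch on `SourceInput` instead of needing multiple inputs.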

Tab Alleman

I know this is way late, but I keep seeing this UNION ALL approach and don't like it.

How about this approach instead:

  1. Run both data flows into their own Recordset Destination, saving each into a variable of type Object (an ADO recordset).

  2. Create a new data flow and use a Script Component source to bring in both ADO objects.

  3. Fill a DataTable from each using a data adapter, and then do whatever you want with them.
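In the actual Script Component this would be C# (an `OleDbDataAdapter.Fill` into one `DataTable` per Object variable). The flow of the three steps can be sketched in Python, with plain lists standing in for the ADO recordset variables (all names hypothetical):

```python
# Stand-ins for the two Object variables filled by the Recordset
# Destinations in step 1.
orders_rs = [(1, "widget"), (2, "gadget")]
errors_rs = [(9, "bad row")]

def fill_tables(*recordsets):
    """Step 3: materialize each recordset into its own in-memory
    table so both can be processed together in one script."""
    return [list(rs) for rs in recordsets]

orders, errors = fill_tables(orders_rs, errors_rs)
```

Unlike the UNION ALL approach, each input keeps its own column structure here, so nothing has to be coerced into a shared shape.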
KeithL