
Objective: Load data from flat files, each with a different set of headers, into its corresponding table. The files can be differentiated by filename, since each filename contains the name of the target table.

What I've tried:

Used a Foreach Loop Container to loop over multiple files, used a variable to dynamically set the incoming file name on the Flat File Source, and did the same for the OLE DB Destination by selecting the option to set the table name from a variable.

  • This didn't work because the metadata does not get updated for the OLE DB Destination; it only works if every file has the same structure.

My options:

  1. Use a Script Task and a control flow condition coupled with a Data Flow Task for each file, so that each data flow has its own Flat File Source and OLE DB Destination. (Not sure this is viable if there are more than 10 files, as the number of Data Flow Tasks will equal the number of files.)

  2. Use a Script Task to take care of loading the data into the individual tables; see the sketch after this list. (Not sure a StreamReader will be as fast as a Data Flow Task.)
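
For illustration, here is a minimal sketch of what the Script Task in option 2 could look like. It assumes a comma-delimited file whose first row is the header, no quoted fields, a package variable User::FilePath set by the Foreach Loop (the variable name and connection string are placeholders), and that the file name equals the table name. SqlBulkCopy does the actual load, so throughput for simple files is typically close to a Data Flow Task:

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    public void Main()
    {
        // Path set by the Foreach Loop enumerator (User::FilePath is an assumed variable name).
        string filePath = Dts.Variables["User::FilePath"].Value.ToString();
        // The file name carries the target table name, per the objective above.
        string tableName = Path.GetFileNameWithoutExtension(filePath);

        var data = new DataTable();
        using (var reader = new StreamReader(filePath))
        {
            // First row holds the headers, which differ per file.
            foreach (string header in reader.ReadLine().Split(','))
                data.Columns.Add(header.Trim(), typeof(string));

            string line;
            while ((line = reader.ReadLine()) != null)
                data.Rows.Add(line.Split(','));  // naive split: no quoted commas
        }

        using (var conn = new SqlConnection(@"Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;"))
        {
            conn.Open();
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = tableName })
            {
                // Map columns by name so the file's column order doesn't matter.
                foreach (DataColumn col in data.Columns)
                    bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName);
                bulk.WriteToServer(data);
            }
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }

Remember to add User::FilePath to the task's ReadOnlyVariables. Everything is read as text here; destination tables with typed columns rely on SqlBulkCopy's implicit conversion during the load.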

Is there any other way to accomplish this apart from the two options above, or dynamically creating a package for each data load?

kurozakura
  • You can't change the metadata in a data flow task at run-time, so you need a flat file source and OLE DB destination for each set of metadata. – Tarzan Oct 27 '16 at 21:01

1 Answer


You can look into BiML, which dynamically creates and executes packages based on metadata about your source and destination.

But personally, I would use your option 1. Ten files (and ten corresponding data flows) is not enough to make me want to complicate things. If you had around 50+ files, then I'd probably look into a BiML or script solution.
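
For what it's worth, a BimlScript skeleton for that route might look something like the following. This is an untested sketch: the folder path is a placeholder, and the Data Flow body is elided because it depends on each file's columns. One package is emitted per file, with the table name taken from the file name:

    <#@ import namespace="System.IO" #>
    <Biml xmlns="http://schemas.varigence.com/biml.xsd">
      <Packages>
        <# foreach (var file in Directory.GetFiles(@"C:\Incoming", "*.csv")) {
             var table = Path.GetFileNameWithoutExtension(file); #>
        <Package Name="Load_<#=table#>" ConstraintMode="Linear">
          <!-- One Data Flow per file goes here: Flat File Source -> OLE DB
               Destination, each with its own metadata, generated instead of
               hand-built in the designer. -->
        </Package>
        <# } #>
      </Packages>
    </Biml>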

Tab Alleman