Objective: Load data from flat files, each with a different header structure, into its corresponding table. The files can be matched to tables by filename, since each filename contains the name of the target table.
What I've tried:
Used a Foreach Loop Container to loop over the files, with a variable to dynamically set the incoming file name on the Flat File Source, and did the same for the OLE DB Destination by selecting the option to get the table name from a variable.
- This didn't work because the metadata does not get updated for the OLE DB Destination; it only works if all the files share the same structure.
My options:
Use a Script Task and control-flow precedence conditions coupled with a separate Data Flow Task for each file. That way each data flow has its own Flat File Source and OLE DB Destination. (Not sure this is viable if there are more than 10 files, since the number of Data Flow Tasks will equal the number of files.)
Use a Script Task to handle loading the data into the individual tables. (Not sure a StreamReader will be as fast as a Data Flow Task.)
Is there any other way to accomplish this, apart from the two options above or dynamically generating a package for each data load?
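For context, here is roughly what I have in mind for option 2 inside the Script Task — a minimal sketch only, assuming simple comma-delimited files whose first row is a header matching the destination column names, with no embedded commas or quotes, and where the base filename equals the table name:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class FlatFileLoader
{
    // Load one CSV into the table named after the file, mapping columns by header name.
    public static void LoadFile(string filePath, string connectionString)
    {
        string tableName = Path.GetFileNameWithoutExtension(filePath);

        var table = new DataTable();
        using (var reader = new StreamReader(filePath))
        {
            // First row: header names become DataTable columns (all strings;
            // SQL Server converts on insert).
            string[] headers = reader.ReadLine().Split(',');
            foreach (string h in headers)
                table.Columns.Add(h.Trim());

            string line;
            while ((line = reader.ReadLine()) != null)
                table.Rows.Add(line.Split(','));
        }

        using (var conn = new SqlConnection(connectionString))
        using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = tableName })
        {
            conn.Open();
            // Map by name rather than ordinal, since each table has different columns.
            foreach (DataColumn c in table.Columns)
                bulk.ColumnMappings.Add(c.ColumnName, c.ColumnName);
            bulk.WriteToServer(table);
        }
    }
}
```

My understanding is that SqlBulkCopy uses the same bulk-load path as the OLE DB Destination's fast load, so throughput should be comparable, though buffering a whole file in a DataTable would not scale to very large files.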