
I have a data flow in SSIS that's using an ODBC Source to a conditional split.

The source returns a dynamic set of columns dependent on availability of data in the source - the number of columns goes from 1 to 13.

In my conditional split I have it pointing at the source and feeding the data to a destination that fits its number of columns.

Example:

Condition 1 -> Map column 1 to column 1 and ignore the other 12 columns

Condition 2 -> Map columns 1 and 2 to columns 1 and 2 and ignore the other 11 columns

However, if the source only contains 1 column, it fails on the second condition with the error "there are some mapping errors on this path".

I know that the count of columns will never exceed 13 which means I can set conditions for columns 1 - 13.

Is there any way that I can ignore the mapping error or force SSIS to stop at the last executable case in my conditional split?

I don't personally want to have to dive into a script component so if this can be done with conditional split I'd be relieved!

Any thoughts?

asked by Channing (edited by Dale K)
    SSIS expects a determined definition for your sources and destinations. If either doesn't have one, you can't use a data flow. That is by design. If you don't have defined data definitions, that is the *real* problem, in my opinion. Needing a "dynamic" approach for your data normally implies a design flaw. – Thom A Apr 16 '20 at 20:47
  • A real problem that I have to face given the ODBC translates a query without any specified columns - the return will always be dynamic dependent on the available data. You could have just suggested a way to create data definitions dynamically (which is what I'm essentially trying to do with the conditional splits) or pointed me to a resource. Thanks. – Channing Apr 16 '20 at 20:56
  • I can't point you to an alternative when the alternative within a data flow doesn't exist, @Channing. Like I said, it's by design. You would likely need to code the solution yourself using a Script Task, which means no conditional split or other transformations, as they don't exist in a Script Task; it's part of the control flow, not the data flow. – Thom A Apr 16 '20 at 20:59

1 Answer


As Larnu indicates, the number of columns in a data flow is a design time artifact and cannot be changed at run-time.

But you should be able to handle this with 13 data flows, one per possible column count.

Execute SQL Task -> However your current ODBC source generates its variable set of columns, determine how many columns are being returned, and assign that count to an SSIS variable, @[User::ColumnCount].
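One way to get that count, *assuming* the source is a SQL Server endpoint (for other ODBC sources you'd use that driver's own metadata facility instead), is to let the engine describe the result set of your source query. `dbo.SourceTable` here is a hypothetical stand-in for whatever query your ODBC Source runs:

```sql
-- Sketch only: count the columns the source query will return.
-- Assumes a SQL Server target; dbo.SourceTable is a placeholder.
SELECT COUNT(*) AS ColumnCount
FROM sys.dm_exec_describe_first_result_set(
    N'SELECT * FROM dbo.SourceTable', NULL, 0);
```

In the Execute SQL Task, set ResultSet to "Single row" and map the `ColumnCount` result to @[User::ColumnCount].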

Attach 13 output paths from the Execute SQL Task to custom Data Flow Tasks, one for each possible number of source columns.

Change the precedence constraint on each of those paths to "Constraint and Expression", with expressions like @[User::ColumnCount] == 1 through @[User::ColumnCount] == 13.
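The thirteen precedence-constraint expressions would look like this (SSIS expression language; @[User::ColumnCount] is the variable populated by the Execute SQL Task):

```
Path to the 1-column data flow:  @[User::ColumnCount] == 1
Path to the 2-column data flow:  @[User::ColumnCount] == 2
...
Path to the 13-column data flow: @[User::ColumnCount] == 13
```

At run time exactly one expression evaluates true, so only the matching Data Flow Task executes.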

The SSIS designer will try to validate metadata as you design the package, as will the execution engine when you run it. Therefore, you'll need to set the Delay Validation property to True on each of the Data Flow Tasks after you finish designing them.

In fact, as I think about this more, you'd likely be better served by a parent/child package paradigm here. Design a package per data flow task and then have the parent/controller package invoke them much as I described above. That should simplify the metadata validation challenges you'll experience trying to get this built.

answered by billinkc
  • This was the perfect solution! I ended up using the parent/child paradigm here to bypass any metadata validation. I know that not having defined sources is a problem as Larnu said, but since this is the only ODBC driver compatible with the source system it's my only choice. I created the controller and used precedence constraints to manage the other 13 packages. Works like a charm. Thank you! – Channing Apr 17 '20 at 20:00