
This seems ridiculously easy, but I can't find it anywhere...

I have a VERY simple sequence container with two tasks: Truncate a SQL table, and repopulate it from production. But this container will be repeated for about 50 tables. The container's name (entered manually) = the name of both the source and destination tables.

[screenshot: Sequence Container]

I have two variables:

[screenshot: Variables]

"TableName" is entered manually. "DelTable" is an expression that uses @[User::TableName] to generate a simple SQL statement.
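A minimal sketch of what that DelTable expression might look like (assuming the tables live in the `dbo` schema):

```
"TRUNCATE TABLE dbo." + @[User::TableName]
```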

I'm super-lazy and would like to use an expression to set "TableName" = the name of the current scope so I only have to enter it once.

Ideas???

THANK YOU!

user1916006
  • What exactly are you calling "the current scope"? The name of the sequence container? – Tab Alleman Aug 20 '18 at 13:16
    You could use an Expression Task with an expression like `@[User::TableName]="OneOfMyTables"`. Would this help in creating the 50 sequence containers? Wouldn't it be more helpful to throw the 50 table names into an ADO recordset variable and use one "Foreach Loop Container" instead of 50 sequence containers? If you are able to parameterize the 50 tasks, try it that way. – Wolfgang Kais Aug 20 '18 at 14:37
  • @WolfgangKais that's pretty much how I answered this question – KeithL Aug 20 '18 at 19:07
  • @KeithL So you described in more detail what I just pointed to? Great! – Wolfgang Kais Aug 20 '18 at 19:20

3 Answers


If you are truncating all tables in a database and reloading them with exactly the same structure, how about this approach:

Execute SQL:

select table_name
from INFORMATION_SCHEMA.TABLES --Add a where to limit the tables to the ones you want

Save the results to an object variable called TABLES.

Add a Foreach Loop container: loop through the ADO object, mapping each row to a string variable called table.

Inside the Foreach Loop, add an Execute SQL Task: truncate table ? and map the parameter.

Add a second Execute SQL Task:

INSERT INTO SERVER.DB.SCHEMA.?
SELECT * FROM ?

Again map the parameters.

If you have trouble mapping the parameters (SQL Server cannot bind a parameter marker to an object name such as a table), set up string variables instead and use expressions on them to build the SQL statements to run.
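For example, the two statements could be built as expressions on the Execute SQL Tasks' variables rather than with `?` markers (the server, database, and schema names below are placeholders, and `table` is the Foreach Loop's string variable):

```
"TRUNCATE TABLE dbo." + @[User::table]

"INSERT INTO dbo." + @[User::table] + " SELECT * FROM PRODSERVER.ProdDB.dbo." + @[User::table]
```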

KeithL

@TomPhillips is correct; unfortunately I cannot comment on or upvote that answer, hence answering here.

There's no quick fix to loop or otherwise automate this unless all 50 tables have the same structure, which is rare by any stretch of the imagination.

BIML is the way to go if you are lazy :)

DIWP

SSIS is not dynamic. Data Flows require fixed input and output at compile time, not runtime. You cannot simply change the table name and have it work.

If you have a list of 50 tables to run the same function on, you can use BIML to dynamically generate the SSIS package(s). But the Data Flow itself cannot be dynamic.
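As a rough sketch of the BIML route, a BimlScript file along these lines can emit one truncate/reload pair per table at package-generation time (the connection name, server, and table list here are made up for illustration):

```xml
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <Package Name="ReloadTables" ConstraintMode="Linear">
      <Tasks>
        <# foreach (var table in new[] { "Table1", "Table2" }) { #>
        <ExecuteSQL Name="Truncate <#=table#>" ConnectionName="Dest">
          <DirectInput>TRUNCATE TABLE dbo.<#=table#>;</DirectInput>
        </ExecuteSQL>
        <ExecuteSQL Name="Reload <#=table#>" ConnectionName="Dest">
          <DirectInput>INSERT INTO dbo.<#=table#> SELECT * FROM PRODSERVER.ProdDB.dbo.<#=table#>;</DirectInput>
        </ExecuteSQL>
        <# } #>
      </Tasks>
    </Package>
  </Packages>
</Biml>
```

Because the SQL is fixed when the packages are generated, this sidesteps the parameter-mapping limitation entirely.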

    That is massively far from true. I use dynamic dataflows (for sources and destinations) all the time. This is very easy in SSDT, however, even in BIDS this was possible. – Thom A Aug 20 '18 at 13:16
  • @Larnu, while some things in a dataflow can be dynamic, I believe Tom is correct that the input and output columns are fixed at compile time. It's true that you can change the table name at runtime, but only if the columns are all exactly the same. However, I'm not sure that this answer at all addresses what the OP is really trying to ask. The question is unclear at this point. – Tab Alleman Aug 20 '18 at 14:44
  • @TabAlleman Yes, that is true. The definition of the datasets themselves need to be static, but a static definition of data doesn't mean the rest of the dataflow can't be dynamic. That intent isn't clear in their answer, however, as it appears to simply state the dataflow can't be dynamic. if that isn't the case, Tom, perhaps you could elaborate more/be more specific. – Thom A Aug 20 '18 at 14:47
  • As I said, the data flow requires fixed input and output. SSIS cannot handle dynamic input/output based on a file or schema change of the input or output, without recompiling the SSIS package. – Tom Phillips Dec 28 '18 at 17:53