
When I use the above syntax in the "Execute row script" step, it reports success but the temporary table is not getting created. Please help me out with this.

user3374000
  • According to [this](http://stackoverflow.com/questions/3887989/global-temporary-tables-in-sql-server) global temp tables go away when all connections to them go out of scope. Once the transform ends, there will be no temp table any more. Please tell us more about what you're doing and how you want to use that temp table. – Brian.D.Myers Mar 04 '14 at 04:01
  • We are using a Select statement instead of a table, for example "select * into ##Temp from (select * from ABC)X". After this we use information_schema.COLUMNS to fetch the data type and other information about the columns via this global temporary table. – user3374000 Mar 04 '14 at 08:40
  • Actually, if we execute select * into ##Temp from (select * from ABC)X; directly in the MSSQL database it works fine, but if we try to execute it through the Pentaho Data Integration (PDI) tool the global temp table is not created. Moreover, PDI does not give any error; it runs successfully, but the global temp table is not created. Please help me out with this. – user3374000 Mar 04 '14 at 08:46

1 Answer


Yes, the behavior you're seeing is exactly what I would expect: it works fine from the T-SQL prompt and throws no error in the transform, but the table is gone after the transform completes.

The problem here is the execution model of PDI transforms. When a transform is run, each step gets its own thread of execution. At startup, any step that needs a DB connection is given its own unique connection. After processing finishes, all steps disconnect from the DB. This includes the connection that defined the temp table. Once that happens (the defining connection goes out of scope), the temp table vanishes.
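
You can reproduce this connection-scoped lifetime outside of PDI. Here's a minimal T-SQL sketch (using the ABC table and ##Temp names from your comments) showing that the global temp table exists only while the creating connection stays open:

    -- Connection 1 (this is effectively what the PDI step's connection does):
    SELECT * INTO ##Temp FROM (SELECT * FROM ABC) X;

    -- While connection 1 is still open, any other connection can see it.
    -- Global temp tables live in tempdb, so query the metadata there:
    SELECT COLUMN_NAME, DATA_TYPE
    FROM tempdb.INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = '##Temp';

    -- As soon as connection 1 closes (the transform finishes and the step
    -- disconnects), SQL Server drops ##Temp automatically and the metadata
    -- query above returns zero rows.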

Note that this means in a transform (as opposed to a Job), you cannot assume a specific order of completion for anything (without Blocking Steps).

We still don't have many specifics about what you're trying to do with this temp table and how you're using its data, but I suspect you want its contents to persist outside your transform. In that case you have some options, but a global temp table like this simply won't work.

Options that come to mind:

  1. Convert the temp table to a permanent table. This is the simplest solution; you're basically making a staging table, loading it with a Table Output step (or whatever), and then reading it with Table Input steps in other transforms (see the sketch after this list).
  2. Write the table contents to a temp file with something like a Text File Output or Serialize to File step, then read it back in from the other transforms.
  3. Store rows in memory. This involves wrapping your transforms in a Job, and using the Copy Rows to Results and Get Rows from Results steps.
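
As a sketch of option 1 (the dbo.Staging_ABC name is just an illustration, not anything from your environment):

    -- Recreate a permanent staging table in place of ##Temp.
    IF OBJECT_ID('dbo.Staging_ABC', 'U') IS NOT NULL
        DROP TABLE dbo.Staging_ABC;

    SELECT * INTO dbo.Staging_ABC FROM (SELECT * FROM ABC) X;

    -- Because the table is permanent, its metadata survives after the
    -- transform's connections close, so any later transform can run:
    SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = 'Staging_ABC';

You'd typically drop or truncate the staging table at the start of each run so stale rows don't leak between runs.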

Each of these approaches has its own pros and cons. For example, storing rows in memory will be faster than writing to disk or network, but memory may be limited.

Another step you might need, depending on what you're doing, is the ETL Metadata Injection step. In many cases it lets you dynamically move metadata from one transform to another. See the docs for descriptions of how each of these steps works.

If you'd like further assistance here, or I've made a wrong assumption, please edit your question and add as much detail as you can.

Brian.D.Myers