
I'm new to ADF and I'm trying to use a dynamic query to build a script where the output of one activity is inserted into another table.

On the "Reading Data" activity, I have a simple script that reads a few records off a table in a Synapse serverless SQL pool. My desired outcome is to insert that output into a table via the "Inserting into table" activity on a Synapse dedicated SQL pool. Here is the dynamic query I have on the "Inserting into table" activity:

@concat( 'INSERT INTO schema_name.table_name', '(', ' column1, column2, column3, column4 ', ')', ' VALUES (', activity('Reading data').output , ')' )

I keep getting the error:

Parse error at line: 6, column: 1020: Incorrect syntax near ']'.

Can you help me understand where the error is? I have no "]" in any of the scripts.

Here is the output of "Reading data" activity:

{
  "resultSetCount": 1,
  "recordsAffected": 0,
  "resultSets": [
    {
      "rowCount": 1,
      "rows": [
        {
          "column1": 1,
          "column2": "testname=",
          "column3": "testslug",
          "column4": "2022-08-15"
        }
      ]
    }
  ],
  "outputParameters": {},
  "outputLogs": "",
  "outputLogsLocation": "",
  "outputTruncated": false,
  "effectiveIntegrationRuntime": "AutoResolveIntegrationRuntime (East US)",
  "executionDuration": 6,
  "durationInQueue": {
    "integrationRuntimeQueue": 1
  },
  "billingReference": {
    "activityType": "PipelineActivity",
    "billableDuration": [
      {
        "meterType": "xxxx",
        "duration": 0.016666666666666666,
        "unit": "Hours"
      }
    ]
  }
}
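Note that the row values sit two arrays deep in that payload. As an analogy for the expression path you'd need in ADF (not ADF code itself), a short Python sketch using the field names from the output above:

```python
import json

# Trimmed-down copy of the "Reading data" activity output shown above.
output = json.loads("""
{
  "resultSetCount": 1,
  "resultSets": [
    {
      "rowCount": 1,
      "rows": [
        {"column1": 1, "column2": "testname=", "column3": "testslug", "column4": "2022-08-15"}
      ]
    }
  ]
}
""")

# The whole object is what activity('Reading data').output returns;
# concatenating it directly serializes everything, brackets included.
# The scalar values live at output.resultSets[0].rows[0].<column>.
first_row = output["resultSets"][0]["rows"][0]
print(first_row["column2"])  # prints: testname=
```

Referencing `.output` by itself pastes the entire JSON object (with its `[` and `]`) into the SQL text, which is where the `Incorrect syntax near ']'` comes from.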

And here is the input of "Insert into table" activity:

{
  "scripts": [
    {
      "type": "Query",
      "text": "INSERT INTO schemaname_tablename (\ncollumn1,\nColumn2,\nColumn3,\nColumn4\n){\"resultSetCount\":1,\"recordsAffected\":0,\"resultSets\":[{\"rowCount\":1,\"rows\":[{\"id\":1,\"name\":\"testname=\",\"Slug\":\"testslug\",\"created_at\":\"2022-08-15\"}]}],\"outputParameters\":{},\"outputLogs\":\"\",\"outputLogsLocation\":\"\",\"outputTruncated\":false,\"effectiveIntegrationRuntime\":\"AutoResolveIntegrationRuntime (US)\",\"executionDuration\":6,\"durationInQueue\":{\"integrationRuntimeQueue\":1},\"billingReference\":{\"activityType\":\"PipelineActivity\",\"billableDuration\":[{\"meterType\":\"xxxx\",\"duration\":0.016666666666666666,\"unit\":\"Hours\"}]}}"
    }
  ]
}

SusanD
  • `activity('Reading data').output` would definitely have a `]` in it. Have you inspected this in the ADF log viewer? – Nick.Mc Aug 15 '22 at 21:27
  • Your question is missing some vital information. Can you make a dbfiddle so that we can see what, for example, the activity actually does, because that entity is making your error. – nbk Aug 15 '22 at 21:31
  • @Nick.McDermaid, hi, here is the outcome of "Reading data" activity. I inspected this and do see brackets but not sure how this fails to translate to values during INSERT in the "Inserting into table" activity: { "resultSetCount": 1, "recordsAffected": 0, "resultSets": [ { "rowCount": 1, "rows": [ { "id": 1, "name": "testname=", "Slug": "ghfj-iii", "created_at": "2022-08-03" } ] } ],. – SusanD Aug 15 '22 at 22:05
  • @nbk, Sure here is more context: "Reading data" activity reads: SELECT top 10 column1, column2, column3, column4 FROM tables_name "Insert into table" activity is supposed to insert the outcome of this query into a table. Does this help? – SusanD Aug 15 '22 at 22:10
  • @nbk Just updated the question description with more context including the Input & Output from ADF pipeline. – SusanD Aug 15 '22 at 22:19
  • Use Copy activity for this? – wBob Aug 15 '22 at 22:46
  • @wBob, the Copy activity had poor performance so I tried doing this by using the General -> Script activity. In general, I'd like to learn how to do this for purposes of educating myself on ADF too. – SusanD Aug 15 '22 at 22:49
  • 2
    Copy is the right activity for this. You say it’s slow but serverless is not built for sub-second performance- it’s better for exploration where you don’t necessarily know the schema. Or it could be your target, you always need to think about (at least) DWU, resource class, table geometry (distribution and index choice) and concurrency when dealing with dedicated sql pools. I wrote a little bit about Script activity [here](https://stackoverflow.com/a/71858876/1527504). – wBob Aug 15 '22 at 23:25
  • Yes, this is a good learning activity. Basically you'll need to fiddle with quotes, escape sequences etc. in this expression: `@concat( 'INSERT INTO schema_name.table_name', '(', ' column1, column2, column3, column4 ', ')', ' VALUES (', activity('Reading data').output , ')' )` because it's not neatly wrapping your result up. – Nick.Mc Aug 16 '22 at 00:49
  • 1
    You need to experiment with the expression language and work out what it's escaping and what it isn't. You can never just lump a big dynamic expression together like this and expect it to work. Also note your insert column list has four columns but your values only has one value. for starters use `activity('Reading data').output.rows[0].id` to just pick one scalar value. – Nick.Mc Aug 16 '22 at 00:53
  • @Nick.McDermaid, thank you. What you noted is similar to what I eventually implemented in the ForEach activity variable. – SusanD Aug 16 '22 at 01:03

1 Answer


OK, after following your feedback and doing some more digging in the Microsoft Azure docs, here is the workflow that worked:

  1. I added a Pipeline variable called "rows" with Array data type.
  2. Then I added a ForEach activity with this item:

@activity('Reading data').output.resultSets

This reads the output of the "Reading data" activity in the Synapse serverless SQL pool.

In the ForEach -> Activities, I added a Set Variable activity. The value for this variable is `@item().rows`. This basically reads the object "rows" from the resultSets array, which is itself an array.

  3. Then I connected the ForEach activity to another Script activity where I built this dynamic query:

@concat('INSERT INTO schema_name.table_name',
'(
column1,
column2,
column3,
column4
)',
' VALUES',
'(',
'''@{variables('rows')[0]}'',',
'''@{variables('rows')[1]}'',',
'''@{variables('rows')[2]}'',',
'''@{variables('rows')[3]}''',
')'
)

And it successfully wrote to the table in the Synapse dedicated SQL pool.
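For illustration only (the ADF expression above is the actual mechanism), the string that expression assembles can be sketched in Python. Here `rows` is a hypothetical stand-in for the pipeline variable, and the table/column names are the placeholders from the question:

```python
# Hypothetical stand-in for the "rows" pipeline variable set inside the ForEach.
rows = [1, "testname=", "testslug", "2022-08-15"]

def sql_quote(value):
    """Single-quote a value for T-SQL, doubling any embedded single quotes."""
    return "'" + str(value).replace("'", "''") + "'"

# Mirrors the concat() expression above: one quoted value per column.
insert_sql = (
    "INSERT INTO schema_name.table_name (column1, column2, column3, column4) "
    "VALUES (" + ", ".join(sql_quote(v) for v in rows) + ")"
)
print(insert_sql)
# INSERT INTO schema_name.table_name (column1, column2, column3, column4)
#   VALUES ('1', 'testname=', 'testslug', '2022-08-15')
```

Doubling embedded single quotes matters: if a value like `O'Brien` is concatenated unescaped, the generated statement breaks (and is open to injection), which is the same class of problem as the original `]` error.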
