In Data Factory, can we have the output from the 'Set Variable' activity logged as a json file?
4 Answers
Another simple way to achieve this is to use the "Add additional columns during copy" feature, as follows.

Have a Set Variable activity set the value of the variable, followed by a Copy activity. In the Copy activity's Source settings there is an Additional columns property, where you can name a column for the source variable and assign it the variable value with the dynamic expression @variables('varInput'). Then, in the Mapping section, remove the unwanted columns and keep only the required ones, including the variable column you created under Additional columns in Source. Finally, give the column your desired name on the destination side and test it.

NOTE: This feature works with the latest dataset model. If you don't see this option in the UI, try creating a new dataset.

Hope this helps.
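For reference, here is a rough sketch of how that Additional columns setting appears in the pipeline JSON behind the UI. The source type (`JsonSource`) and the variable name `varInput` are just examples taken from this answer; your store settings will depend on your dataset:

```json
"source": {
    "type": "JsonSource",
    "additionalColumns": [
        {
            "name": "varInput",
            "value": {
                "value": "@variables('varInput')",
                "type": "Expression"
            }
        }
    ]
}
```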

- So in source properties you have selected "File path in dataset", but your source is not a file, it is a variable. So what are you giving in FolderPath and FileName in the source dataset? Because these fields are mandatory. – Bilal Shafqat Oct 02 '20 at 15:04
- This worked, thanks! @BilalShafqat you can create an empty json file in your data source and use the path of that file in the source. – Stramzik Mar 22 '21 at 07:36
- How are you making the GIFs, just out of interest, @KranthiPakala-MSFT? – wBob Jun 17 '21 at 16:10
- Hey @wBob, you can try this out: https://www.screentogif.com/ – Kranthi Pakala Jun 17 '21 at 19:19
- @KranthiPakala-MSFT this works only when you have an existing json file. What do you do when you don't have an existing file in the source dataset and just want to save a bunch of variables to a file in json format? – codeomnitrix Jul 09 '21 at 12:33
There is no built-in easy way to do this, as far as I know.

Two workarounds:

1. Enable the Azure Monitor diagnostic log in ADF to log data into Azure Blob Storage as JSON files. Every activity's execution details (including its output) are logged in the file. However, you need to learn the structure of the json schema and extract what you want.
2. Use an Azure Function or Web activity after the Set Variable activity and pass it `@activity('Set Variable1').output`. Save the output as a json file from inside the function using SDK code.
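A minimal sketch of the second workaround in Python, assuming the Azure Function receives the activity output payload as its request body. The function name, blob name, and payload shape are placeholders; the actual upload would use the azure-storage-blob SDK's `BlobClient.upload_blob`, which is only sketched in a comment here:

```python
import json

def build_log_blob(activity_output: dict, blob_name: str = "set-variable-output.json"):
    """Serialize the payload ADF sends (e.g. @activity('Set Variable1').output,
    passed as the function body) into JSON bytes ready for upload."""
    payload = json.dumps(activity_output, indent=2).encode("utf-8")
    return blob_name, payload

# Example payload shape (an assumption, for illustration only)
name, data = build_log_blob({"name": "varInput", "value": "hello"})
print(name)           # set-variable-output.json
print(data.decode())

# Upload step (requires azure-storage-blob; sketched, not run here):
# from azure.storage.blob import BlobClient
# BlobClient.from_connection_string(conn_str, "logs", name).upload_blob(data, overwrite=True)
```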

- Thank you for the response. Apparently I need to gain access to Azure Functions first, so I cannot test it now. I am still wondering how to achieve that, though (which method to use, etc.). – OreoFanatics Mar 17 '20 at 07:05
- @OreoFanatics Are you able to pass parameters into the Azure Function? If so, `@activity('Set Variable1').output` is exactly that parameter. Then inside the Azure Function, you grab that param and store it in a json file using the blob storage SDK. – Jay Gong Mar 17 '20 at 07:08
- Just wanted to share this info: we cannot use `@activity('Set Variable1').output` in subsequent activities, since the 'Set Variable1' activity has no output to reference. – Kranthi Pakala Apr 22 '20 at 23:11
I generally use the Copy activity for writing files but it is possible to write content to Azure Data Lake (ADLS) Gen 2 using the Blob REST API and PUT
command. The settings in the Web activity are crucial to this working:
| Setting | Value | Notes |
|---|---|---|
| URL | some blob | NB this uses the `.blob` address, not the `.dfs` one. The path must end in `?resource=file` |
| Method | PUT | |
| Headers | | |
| x-ms-version | 2019-07-07 | |
| x-ms-blob-type | BlockBlob | |
| Content-Type | application/json | This value is for writing json but can be customised, eg application/csv |
| Body | @variables('varResult') | I'm using a pre-prepared variable with json content, but this can be anything |
| Authentication | Managed Identity | |
| Resource | https://storage.azure.com | |
Note you must set the URL to the folder and file name you want, and use the `.blob` address. The URL must end with `?resource=file`:

Example URL / blob address: https://yourstorage.blob.core.windows.net/yourFilesystem/yourFolder/someFile.json?resource=file

Note also I'm writing json here, but you can amend the Content-Type as required, eg application/csv. I am using a variable in the Body, but this can be anything you like. The documentation states this only supports files up to 2 GB, so this is only for small payloads.

I wasn't able to get this to work with the `.dfs` address and/or Data Lake methods, but that's fine as long as it works on blob.
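The same PUT call can be sketched outside ADF with Python's standard library, mirroring the Web activity settings above. The storage URL and the token are placeholders; in the Web activity, ADF's managed identity supplies the bearer token for you:

```python
import json
import urllib.request

# Placeholder URL from the example above; the token is supplied by managed identity in ADF.
BLOB_URL = "https://yourstorage.blob.core.windows.net/yourFilesystem/yourFolder/someFile.json?resource=file"

def build_put_request(url: str, body: dict, bearer_token: str) -> urllib.request.Request:
    """Build the Blob REST API PUT request matching the Web activity settings."""
    return urllib.request.Request(
        url=url,
        data=json.dumps(body).encode("utf-8"),
        method="PUT",
        headers={
            "x-ms-version": "2019-07-07",
            "x-ms-blob-type": "BlockBlob",
            "Content-Type": "application/json",
            "Authorization": f"Bearer {bearer_token}",
        },
    )

req = build_put_request(BLOB_URL, {"status": "ok"}, "<token>")
print(req.method, req.get_full_url())
# Sending is omitted here; urllib.request.urlopen(req) would perform the upload.
```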

If you want to write the content of a variable of type Array, there is a workaround which works fine. Goal: write the content of your array into a file, one line per value of the array.

variable: [a, b, c]
to
file content:
a
b
c

Steps:
- Create an 'empty' file with 1 row; it can be a json file or something else with just 1 row
- Use the additional column mechanism
- Join the array variable with a carriage return using @join and @decodeUriComponent: @join(variable, decodeUriComponent('%0A'))

Yes, it is horrible that Microsoft doesn't have a @char(int) function to create a special character (or I am an idiot and don't know the right way to concat a '\n', which I tried but didn't work).
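The join-with-newline trick can be verified outside ADF: `decodeUriComponent('%0A')` percent-decodes to a newline, which Python's `urllib.parse.unquote` reproduces (the variable content is just the example from above):

```python
from urllib.parse import unquote

var_array = ["a", "b", "c"]              # the Array variable
newline = unquote("%0A")                 # equivalent of decodeUriComponent('%0A')
file_content = newline.join(var_array)   # equivalent of @join(variable, decodeUriComponent('%0A'))
print(file_content)
# a
# b
# c
```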
