
I am developing Spark notebooks in Microsoft Azure Synapse. I can easily pass in a parameter with the base mount point for accessing files in Data Lake Gen2 storage, but I would rather use the linked service defined in the workspace to resolve the mount point.

This will make migrating to stage and production easier. Each environment is a different subscription and storage account.

Does anybody have a way of doing this, or a best practice?

Thanks

When I started, I just hard-coded the mount point in the notebook, but that is not acceptable since we want to promote the code across environments.

Daniel Mann

1 Answer


You can mount an ADLS Gen2 storage account using the linked service created in the workspace instead of hard-coding the storage path. ADLS linked service:

[screenshot: ADLS Gen2 linked service definition in the Synapse workspace]

Use the code below to create the mount:

mssparkutils.fs.mount(
    "abfss://<containerName>@<ADLSAccName>.dfs.core.windows.net",  # source container URI
    "/<mountPoint>",                                               # mount point, e.g. "/data"
    {"linkedService": "<linkedServiceName>"}                       # linked service defined in the workspace
)

[screenshot: output of the mount command in the Synapse notebook]

Bhavani