I am writing Delta Live Tables notebooks in SQL that access files from the data lake, something like this:
CREATE OR REFRESH STREAMING LIVE TABLE MyTable
AS SELECT * FROM cloud_files("DataLakeSource/MyTableFiles", "parquet", map("cloudFiles.inferColumnTypes", "true"))
Whenever I need to access the Azure Data Lake, I usually do something like this to set up access:
# Service principal (OAuth) access to the "mylake" storage account
service_credential = dbutils.secrets.get(scope="myscope", key="mykey")
spark.conf.set("fs.azure.account.auth.type.mylake.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.mylake.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.mylake.dfs.core.windows.net", "99999999-9999-9999-9999-999999999")
spark.conf.set("fs.azure.account.oauth2.client.secret.mylake.dfs.core.windows.net", service_credential)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.mylake.dfs.core.windows.net", "https://login.microsoftonline.com/99999999-9999-9999-9999-9999999999/oauth2/token")
Since I can't add a Python cell like the one above inside a SQL Delta Live Tables notebook, how/where do I add the configuration for access to the data lake files?
I've thought about adding the config info to the pipeline settings under Configuration (roughly as sketched below), but that of course won't work for the call to dbutils.secrets.get.
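For clarity, this is roughly what I had in mind as key/value pairs in the pipeline's Configuration section (the same ABFS keys as in the Python above); the client secret line is exactly the part I can't fill in without dbutils.secrets.get:

fs.azure.account.auth.type.mylake.dfs.core.windows.net = OAuth
fs.azure.account.oauth.provider.type.mylake.dfs.core.windows.net = org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider
fs.azure.account.oauth2.client.id.mylake.dfs.core.windows.net = 99999999-9999-9999-9999-999999999
fs.azure.account.oauth2.client.secret.mylake.dfs.core.windows.net = <no way to call dbutils.secrets.get here>
fs.azure.account.oauth2.client.endpoint.mylake.dfs.core.windows.net = https://login.microsoftonline.com/99999999-9999-9999-9999-9999999999/oauth2/token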