
In Azure Databricks, I have a Unity Catalog metastore created on ADLS in its own container (metastore@stgacct.dfs.core.windows.net/), connected via the Azure managed identity. Works fine.

I have a container on the same storage account called data. I'm using notebook-scoped credentials (a service principal set via Spark conf) to gain access to that container, addressing it as abfss://data@stgacct... Works fine.
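For reference, the notebook-scoped credentials are set roughly like this (the secret scope, key names, application ID, and tenant ID are placeholders):

```python
# Minimal sketch of the notebook-scoped OAuth setup for the "data" container.
# Secret scope/key, SP application ID, and tenant ID are hypothetical values.
service_credential = dbutils.secrets.get(scope="my-scope", key="sp-client-secret")

spark.conf.set("fs.azure.account.auth.type.stgacct.dfs.core.windows.net", "OAuth")
spark.conf.set("fs.azure.account.oauth.provider.type.stgacct.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set("fs.azure.account.oauth2.client.id.stgacct.dfs.core.windows.net",
               "<sp-application-id>")
spark.conf.set("fs.azure.account.oauth2.client.secret.stgacct.dfs.core.windows.net",
               service_credential)
spark.conf.set("fs.azure.account.oauth2.client.endpoint.stgacct.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant-id>/oauth2/token")
```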

Using the Python Delta API, I create a DeltaTable object with deltaTable = DeltaTable.forName(spark, "mycat.myschema.mytable"). I'm able to perform normal Delta operations with that object, such as MERGE. Works fine.
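Roughly what the working flow looks like (the source DataFrame and the id join key are hypothetical, just to show the kind of MERGE that succeeds):

```python
from delta.tables import DeltaTable

deltaTable = DeltaTable.forName(spark, "mycat.myschema.mytable")

# Hypothetical source DataFrame and join key, for illustration only.
updates_df = spark.createDataFrame([(1, "new value")], ["id", "value"])

# A MERGE like this runs without any credential problems.
(deltaTable.alias("t")
    .merge(updates_df.alias("s"), "t.id = s.id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```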

However, if I run the deltaTable.detail() command, I get the error: "Your query is attempting to access overlapping paths through multiple authorization mechanisms, which is not currently supported."
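For completeness, the failing call. detail() returns a one-row DataFrame that includes the table's physical location, which is presumably why it has to resolve the storage path directly:

```python
# Fails with the overlapping-paths error, even though MERGE on the same
# object succeeds. detail() exposes metadata such as name, location, numFiles.
deltaTable.detail().select("name", "location", "numFiles").show(truncate=False)
```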

It's as if Spark doesn't know which credential to use to fulfill the .detail() call: the metastore identity, or the SPN I used when I scoped my credentials for the data container (which also has rights to the metastore container).

To test this, I restarted my cluster, which drops the Spark conf for ADLS, and ran deltaTable = DeltaTable.forName(spark, "mycat.myschema.mytable") followed by deltaTable.detail(). That fails with "Failure to initialize configuration: Invalid configuration value detected for fs.azure.account.key", as if it's not using the metastore credentials, which I would have expected since it's a Unity Catalog managed table (??).
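The repro sequence on the freshly restarted cluster, before any notebook-scoped ADLS credentials have been set:

```python
from delta.tables import DeltaTable

# Fresh cluster, no spark.conf.set for ADLS has been run yet.
deltaTable = DeltaTable.forName(spark, "mycat.myschema.mytable")  # succeeds
deltaTable.detail()  # fails: "Invalid configuration value detected for fs.azure.account.key"
```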

Suggestions?
