I mounted an Azure Blob Storage container (I confirmed the account is the blob type):
dbutils.fs.mount(
    source="wasbs://mycontainer@myblobstorageaccount.blob.core.windows.net",
    mount_point="/mnt/mymountpoint",
    extra_configs={
        "fs.azure.sas.mycontainer.myblobstorageaccount.blob.core.windows.net":
            dbutils.secrets.get(scope="mydatabricksscope", key="nameofmysecretinkeyvault")
    }
)
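To confirm the mount registered, I check the mount list (a minimal snippet; the mount point name is the one above):

# List the workspace mounts and print the entry for the new mount point
for m in dbutils.fs.mounts():
    if m.mountPoint == "/mnt/mymountpoint":
        print(m.mountPoint, "->", m.source)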
The mount point is created and I can see it in Databricks. I placed a CSV file in the container, along with a folder containing more CSV files, but I cannot access anything:
dbutils.fs.ls("/mnt/mymountpoint/")
java.io.FileNotFoundException: / is not found
dbutils.fs.ls("wasbs://mycontainer@mystorageblob.blob.core.windows.net/")
shaded.databricks.org.apache.hadoop.fs.azure.AzureException: shaded.databricks.org.apache.hadoop.fs.azure.AzureException: Container ownbackupsalesforcedevemastercnt in account ownbackupsalesforcedev.blob.core.windows.net not found, and we can't create it using anoynomous credentials, and no credentials found for them in the configuration.
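If it helps diagnose, this is the documented direct-access pattern with a session-level SAS setting that I would expect to work with the same secret (a sketch reusing the placeholder names from the mount call; it assumes the secret holds a valid SAS token for this container):

# Put the SAS token for this container into the Spark session config,
# then read over wasbs:// directly instead of through the mount.
spark.conf.set(
    "fs.azure.sas.mycontainer.myblobstorageaccount.blob.core.windows.net",
    dbutils.secrets.get(scope="mydatabricksscope", key="nameofmysecretinkeyvault")
)
df_direct = (spark.read
    .format("csv")
    .option("header", "true")
    .load("wasbs://mycontainer@myblobstorageaccount.blob.core.windows.net/mycsv.csv"))

Reading through the mount point fails as well: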
df = spark.read.format('csv').load('/mnt/mymountpoint/mycsv.csv', header="true")
AnalysisException: Path does not exist: dbfs:/mnt/ownbackupsalesforcemnt/accounts.csv
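One way to sanity-check the SAS token itself, outside Spark, would be the Azure SDK (an assumption on my part: this requires azure-storage-blob on the cluster, e.g. %pip install azure-storage-blob, and it assumes the secret is the raw SAS token string):

# Try listing the container with the SAS token via the Azure SDK;
# an error here would point at the token (permissions/expiry), not the mount.
from azure.storage.blob import ContainerClient

sas = dbutils.secrets.get(scope="mydatabricksscope", key="nameofmysecretinkeyvault")
client = ContainerClient(
    account_url="https://myblobstorageaccount.blob.core.windows.net",
    container_name="mycontainer",
    credential=sas,
)
print([b.name for b in client.list_blobs()])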
I've re-created the secret scope in Databricks and unmounted and remounted this several times, but I still cannot get in. Can anyone please help me?
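For completeness, the unmount/remount cycle I repeat looks roughly like this (same mount point as above; the mounts() check just avoids an error when nothing is mounted):

# Drop the mount if present, refresh the mount cache on running clusters,
# then re-run the dbutils.fs.mount(...) call from the top of the post.
if any(m.mountPoint == "/mnt/mymountpoint" for m in dbutils.fs.mounts()):
    dbutils.fs.unmount("/mnt/mymountpoint")
dbutils.fs.refreshMounts()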