Check whether any data is available at the path /mnt/container-name/folder-name,
and also check the storage connection between the data lake and Databricks.
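A minimal sketch of such a check, assuming a Python notebook cell and the path from the question (dbutils.fs.ls raises an exception when the path does not exist or is not reachable):

# List the path; an exception usually means the path is missing
# or the mount/storage connection is broken.
try:
    files = dbutils.fs.ls("/mnt/container-name/folder-name")
    print(f"Found {len(files)} item(s) at the path.")
except Exception as e:
    print(f"Path is not accessible: {e}")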
Otherwise, create a mount and a storage connection between the data lake and Databricks by following the steps below.
# Set the storage account access key from a secret scope
# (replace blobstoragename, scopename, and keyvalue with your own values).
spark.conf.set("fs.azure.account.key.blobstoragename.dfs.core.windows.net",
               dbutils.secrets.get(scope="scopename", key="keyvalue"))
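To confirm the connection works after setting the key, a hedged sketch that reads directly over abfss (containername and folder-name are placeholders; keys set with spark.conf.set apply to the Spark session, so a DataFrame read is the reliable check):

# folder-name here is a hypothetical Delta folder in the container.
df = spark.read.format("delta").load(
    "abfss://containername@blobstoragename.dfs.core.windows.net/folder-name"
)
display(df)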
Create the mount:
# Mount the Blob Storage container (replace containername,
# blobstoragename, and the mount point with your own values).
dbutils.fs.mount(
  source = "wasbs://containername@blobstoragename.blob.core.windows.net",
  mount_point = "/mnt/iotd/fgh",
  extra_configs = {"fs.azure.account.key.blobstoragename.blob.core.windows.net": "<paste access key here>"}
)
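To verify the mount succeeded, two quick checks (the mount point comes from the example above):

display(dbutils.fs.mounts())             # list all active mount points
display(dbutils.fs.ls("/mnt/iotd/fgh"))  # list files under the new mount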
For example, to overwrite the data in a table you can:
df.write.format("delta").mode("overwrite").save("/mnt/container-name/folder-name")
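To confirm the overwrite, the table can be read back from the same path (a minimal sketch using the path from the example above):

df = spark.read.format("delta").load("/mnt/container-name/folder-name")
display(df)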
References:
https://docs.databricks.com/data/data-sources/azure/adls-gen2/azure-datalake-gen2-get-started.html
Table batch reads and writes - Azure Databricks | Microsoft Docs
Table batch reads and writes - Delta Lake Documentation
https://www.youtube.com/watch?v=cbobqI3ZGuA