Currently, I am trying to copy data from Azure Databricks Delta Lake to my Azure Data Lake through Azure Data Factory. I want to copy to a dynamic directory with a dynamic file name, but I keep receiving this error: "Direct copying data from Azure Databricks Delta Lake is only supported when sink is a folder, please enable staging or fix File path". When I choose a fixed file path and a fixed container, the pipeline works, but then I cannot copy into different directories. Has anyone faced something like this? Any advice would be appreciated.
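For reference, this is a minimal sketch of the kind of sink dataset I am using; the dataset name, parameter names, container, and linked service reference (`SinkDynamic`, `dirName`, `fileName`, `mycontainer`, `AzureDataLakeStorageLS`) are placeholders, not my real values:

```json
{
    "name": "SinkDynamic",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureDataLakeStorageLS",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "dirName": { "type": "string" },
            "fileName": { "type": "string" }
        },
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "mycontainer",
                "folderPath": { "value": "@dataset().dirName", "type": "Expression" },
                "fileName": { "value": "@dataset().fileName", "type": "Expression" }
            }
        }
    }
}
```

The error appears as soon as the dynamic `fileName` is set; with `fileName` removed (folder-only sink), direct copy is accepted.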
I then tried to enable staging, and that error went away, but now I get another error: "Databricks is not authorized to perform this operation using this permission xxx 403...".
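This is roughly how I enabled staging on the copy activity; the activity name, staging linked service, and staging path (`CopyFromDeltaLake`, `StagingBlobLS`, `staging-container/adf`) are placeholders:

```json
{
    "name": "CopyFromDeltaLake",
    "type": "Copy",
    "typeProperties": {
        "source": { "type": "AzureDatabricksDeltaLakeSource" },
        "sink": { "type": "DelimitedTextSink" },
        "enableStaging": true,
        "stagingSettings": {
            "linkedServiceName": {
                "referenceName": "StagingBlobLS",
                "type": "LinkedServiceReference"
            },
            "path": "staging-container/adf"
        }
    }
}
```

My guess is that the Databricks cluster itself cannot reach the staging storage account, but I have not found the right way to grant it access.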