
I am trying to write data from Azure Databricks to a Synapse dedicated SQL pool table.

The data is stored in ADLS Gen2, and I am trying to write a DataFrame into a SQL table.

I also created a service principal for Azure Databricks, which I use in Synapse as db_owner.
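For reference, a minimal sketch of the kind of write I am attempting via the Azure Synapse connector (`com.databricks.spark.sqldw`); all workspace, pool, table, and storage names below are placeholders, not my real resources:

```python
# Sketch of a write to a Synapse dedicated SQL pool. All names below are
# placeholders; the option keys are the standard Synapse connector options.

def synapse_write_options(server, database, table, temp_dir):
    """Build the option map for df.write.format("com.databricks.spark.sqldw")."""
    return {
        "url": f"jdbc:sqlserver://{server}:1433;database={database}",
        "dbTable": table,
        # ADLS Gen2 staging area used by the connector during the load.
        "tempDir": temp_dir,
        # Forward the storage credentials set on the session to Synapse.
        "forwardSparkAzureStorageCredentials": "true",
    }

opts = synapse_write_options(
    "myworkspace.sql.azuresynapse.net",  # placeholder workspace endpoint
    "mydedicatedpool",                   # placeholder dedicated SQL pool
    "dbo.mytable",                       # placeholder target table
    "abfss://temp@mystorageacct.dfs.core.windows.net/synapse/tmp",
)

# On a cluster the actual write would be:
# df.write.format("com.databricks.spark.sqldw").options(**opts).mode("append").save()
print(opts["url"])
```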


While running the code, I get the error below:

java.lang.IllegalArgumentException: Could not retrieve Credential Passthrough token. Please check if Credential Passthrough is enabled

Can someone please explain what is wrong here? My cluster shows credential passthrough as enabled.

azuresnowflake1
  • Check this [document](https://chinnychukwudozie.com/2020/11/13/write-data-from-azure-databricks-to-azure-synapse-analyticsformerly-sql-dw-using-adls-gen-2/) if it helps you. – Pratik Lad May 10 '23 at 20:10

1 Answer


I got the same error when I tried it in my environment.


In my case, credential passthrough was not enabled for the cluster, and I still got the error.


To resolve this error, make sure you have the correct permissions as mentioned in this documentation.

Also, you can try using two service principals: one for the storage account and another for the Synapse workspace. Add these in the Databricks init script as mentioned in this documentation.

Naveen Sharma
  • Hi - I tried this and saw that the connection was made and the temp directory was populated, but after that I get a 403 Unauthorized "can't access external location" error. Can you please suggest a fix? – azuresnowflake1 May 13 '23 at 06:08
  • A 403 error means you don't have access to it. Make sure your service principals have the correct roles in Synapse. – Rakesh Govindula May 13 '23 at 10:45