
Recently, Databricks launched Databricks Connect, which allows you to write jobs using native Spark APIs and have them execute remotely on an Azure Databricks cluster instead of in the local Spark session.
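For context, a minimal Databricks Connect session looks something like this (a sketch, assuming the databricks-connect client is already installed and configured for your workspace; the example job is mine, not from the original post):

    # pip install databricks-connect
    # databricks-connect configure   (prompts for workspace URL, token, cluster ID)
    from pyspark.sql import SparkSession

    # With Databricks Connect configured, getOrCreate() returns a session
    # whose jobs execute on the remote Azure Databricks cluster.
    spark = SparkSession.builder.getOrCreate()

    print(spark.range(100).count())  # runs remotely, prints 100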

It works fine except when I try to access files in Azure Data Lake Storage Gen2. When I execute this:

spark.read.json("abfss://...").count()

I get this error:

java.lang.RuntimeException: java.lang.ClassNotFoundException: Class shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.SecureAzureBlobFileSystem not found
  at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)

Does anybody know how to fix this?


flappy

2 Answers


If you mount the storage rather than use a service principal, you should find this works: https://docs.databricks.com/spark/latest/data-sources/azure/azure-datalake-gen2.html
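For reference, the mount pattern from that docs page looks roughly like this (a sketch; the storage account, container, service principal details, and secret scope names below are placeholders, not values from this thread):

    # Run once on the cluster (e.g. in a notebook):
    # mount ADLS Gen2 using a service principal via OAuth.
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<application-id>",
        "fs.azure.account.oauth2.client.secret":
            dbutils.secrets.get(scope="<scope>", key="<service-credential-key>"),
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }

    dbutils.fs.mount(
        source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
        mount_point="/mnt/<mount-name>",
        extra_configs=configs)

    # From Databricks Connect, read through the mount point instead of abfss://
    spark.read.json("/mnt/<mount-name>/path/to/data").count()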

I posted some notes on the limitations of Databricks Connect here: https://datathirst.net/blog/2019/3/7/databricks-connect-limitations

simon_dmorias
  • This is a useful workaround, thanks! Do you know if there is any way to report this issue (and/or other limitations you found) to Databricks? – tomconte Sep 25 '20 at 08:26

Likely too late, but for completeness' sake, there's one issue to look out for on this one. If you have this Spark conf set, you'll see that exact error (which is pretty hard to unpack):

fs.abfss.impl org.apache.hadoop.fs.azurebfs.SecureAzureBlobFileSystem

So double-check your Spark configs, remove that setting if it's present, and make sure you have permission to access ADLS Gen2 directly using the storage account access key.
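Direct access with the account key looks roughly like this (a sketch; the storage account, container, and key values are placeholders, not values from this thread):

    # Set the account key for direct abfss:// access,
    # rather than overriding fs.abfss.impl.
    spark.conf.set(
        "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
        "<storage-account-access-key>")  # ideally pulled from a secret scope

    spark.read.json(
        "abfss://<container>@<storage-account>.dfs.core.windows.net/path/to/data").count()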