I am trying to read the Delta log content in an Azure notebook and the code is failing, whereas the same code works locally in IntelliJ. The dependency I have on the cluster is:
delta-core_2.12:2.0.0
Code reference:
val configMap = Map(
  "fs.azure.account.auth.type" -> "OAuth",
  "fs.azure.account.oauth2.client.endpoint" -> url,
  "fs.azure.account.oauth2.client.id" -> clientId,
  "fs.azure.account.oauth2.client.secret" -> clientKey,
  "fs.azure.account.oauth.provider.type" -> ".....ClientCredsTokenProvider"
)

val basePath = "abfss:..."
val deltaTable = DeltaLog.forTable(spark, basePath, configMap)
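For reference, the same OAuth credentials can also be set on the Spark session instead of being passed as a per-table options map; the ABFS connector then picks them up from per-account keys. A minimal sketch, assuming a placeholder storage account name and Hadoop's `ClientCredsTokenProvider` class (the account name is hypothetical here, not from my actual setup):

```scala
// Sketch: wiring ADLS Gen2 OAuth configs onto the session rather than
// passing them to DeltaLog.forTable. "<storage-account>" is a placeholder.
val account = "<storage-account>.dfs.core.windows.net"

spark.conf.set(s"fs.azure.account.auth.type.$account", "OAuth")
spark.conf.set(s"fs.azure.account.oauth.provider.type.$account",
  "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(s"fs.azure.account.oauth2.client.endpoint.$account", url)
spark.conf.set(s"fs.azure.account.oauth2.client.id.$account", clientId)
spark.conf.set(s"fs.azure.account.oauth2.client.secret.$account", clientKey)

// With the session configured, the table can be loaded without the options map:
val deltaLog = org.apache.spark.sql.delta.DeltaLog.forTable(spark, basePath)
```

Either way the credentials reach the same Hadoop configuration, so this does not change the class-loading behaviour at issue below; it is shown only to rule out the options map itself as the cause.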
The error message I get:
IllegalArgumentException: requirement failed: Config entry spark.databricks.delta.timeTravel.resolveOnIdentifier.enabled already registered!
at scala.Predef$.require(Predef.scala:281)
at org.apache.spark.internal.config.ConfigEntry$.registerEntry(ConfigEntry.scala:352)
at org.apache.spark.internal.config.ConfigEntry.<init>(ConfigEntry.scala:87)
at org.apache.spark.internal.config.ConfigEntryWithDefault.<init>(ConfigEntry.scala:133)
at org.apache.spark.internal.config.TypedConfigBuilder.createWithDefault(ConfigBuilder.scala:149)
at org.apache.spark.sql.delta.sources.DeltaSQLConfBase.$init$(DeltaSQLConf.scala:40)
at org.apache.spark.sql.delta.sources.DeltaSQLConf$.<init>(DeltaSQLConf.scala:782)
at org.apache.spark.sql.delta.sources.DeltaSQLConf$.<clinit>(DeltaSQLConf.scala)
at org.apache.spark.sql.delta.DeltaLog$.apply(DeltaLog.scala:573)
at org.apache.spark.sql.delta.DeltaLog$.forTable(DeltaLog.scala:496)
On the second execution of the code block, the error changes to:
NoClassDefFoundError: Could not initialize class org.apache.spark.sql.delta.sources.DeltaSQLConf$
at org.apache.spark.sql.delta.DeltaLog$.apply(DeltaLog.scala:573)
at org.apache.spark.sql.delta.DeltaLog$.forTable(DeltaLog.scala:496)
Can someone tell me what I am missing in the Azure notebook that prevents it from reading the Delta log content?