
I started looking into Delta Lake and got this exception when trying to update a table.

I'm using:

AWS EMR 5.29

Spark 2.4.4

Scala 2.11.12, with io.delta:delta-core_2.11:0.5.0
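
For context, a common way to pull that dependency into an EMR Spark shell is the --packages flag (a sketch, assuming spark-shell is on the PATH of the EMR master node and the coordinates above are the ones in use):

```shell
# Launch spark-shell with Delta Lake 0.5.0 built for Scala 2.11,
# matching the Spark 2.4.4 / Scala 2.11.12 environment above.
spark-shell --packages io.delta:delta-core_2.11:0.5.0
```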

import io.delta.tables._
import org.apache.spark.sql.functions._
import spark.implicits._

val deltaTable = DeltaTable.forPath(spark, "s3://path/")

deltaTable.update(col("col1") === "val1", Map("col2" -> lit("val2")))

java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;)Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;
  at org.apache.spark.sql.delta.util.AnalysisHelper$class.tryResolveReferences(AnalysisHelper.scala:33)
  at io.delta.tables.DeltaTable.tryResolveReferences(DeltaTable.scala:42)
  at io.delta.tables.execution.DeltaTableOperations$$anonfun$5.apply(DeltaTableOperations.scala:93)
  at io.delta.tables.execution.DeltaTableOperations$$anonfun$5.apply(DeltaTableOperations.scala:93)
  at org.apache.spark.sql.catalyst.plans.logical.UpdateTable$$anonfun$1.apply(UpdateTable.scala:57)
  at org.apache.spark.sql.catalyst.plans.logical.UpdateTable$$anonfun$1.apply(UpdateTable.scala:52)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
  at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
  at scala.collection.AbstractTraversable.map(Traversable.scala:104)
  at org.apache.spark.sql.catalyst.plans.logical.UpdateTable$.resolveReferences(UpdateTable.scala:52)
  at io.delta.tables.execution.DeltaTableOperations$class.executeUpdate(DeltaTableOperations.scala:93)
  at io.delta.tables.DeltaTable.executeUpdate(DeltaTable.scala:42)
  at io.delta.tables.DeltaTable.updateExpr(DeltaTable.scala:361)
  ... 51 elided

Any idea why?

Thanks!

Guy Harari

2 Answers

Sorry for the inconvenience, but this is a bug in the version of Spark bundled with emr-5.29.0. It will be fixed in emr-5.30.0, but in the meantime you can use emr-5.28.0, which does not contain this bug.

Jonathan Kelly

This is usually caused by an incompatible Spark version. You can print sc.version to check which Spark version you are actually running against.
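
A minimal sketch of that check, assuming an active spark-shell session (where spark and sc are predefined):

```scala
// Print the Spark version the session is actually running against.
// Delta Lake 0.5.0 documents a requirement of Apache Spark 2.4.2 or above,
// so a mismatched or patched Spark build can surface as NoSuchMethodError.
println(spark.version) // via the SparkSession
println(sc.version)    // via the SparkContext; should match
```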

zsxwing