
I am trying to use Spark 3.2.0 and delta-core-1.1.0.jar, and am getting the following error:

org.apache.spark.sql.AnalysisException: default.testData is not a Delta table.

This is how I am saving the dataset:

    tableDataset.write().option("path", clientId + "/" + jobname + "/" + tableName).format("delta").saveAsTable(tableName);

When I use Spark 3.0 and delta-core-0.8.jar, the code works fine.

Kindly let us know what needs to be modified with respect to the upgraded Spark version.

  • Please post the full exception. – Alex Ott Feb 14 '22 at 10:32
  • Was able to solve the issue; it was due to the Spark metastore not getting set. When we removed `.option("path", clientId + "/" + jobname + "/" + tableName)`, the issue got solved. – Daksh Feb 14 '22 at 13:22
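Based on the comment above, the fix was to save the dataset as a metastore-managed table (no explicit `path` option). A minimal sketch of what that write might look like on Spark 3.2.0 with Delta 1.1.0 — the session configs shown are the ones Delta 1.x requires, while the `local[*]` master, app name, and sample dataset are illustrative assumptions, not from the original post:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DeltaSaveExample {
    public static void main(String[] args) {
        // Assumed setup: Delta Lake 1.1.0 on Spark 3.2.0 needs these two
        // session configs so that saveAsTable resolves Delta tables correctly.
        SparkSession spark = SparkSession.builder()
            .appName("delta-save")          // hypothetical app name
            .master("local[*]")             // hypothetical local master
            .config("spark.sql.extensions",
                    "io.delta.sql.DeltaSparkSessionExtension")
            .config("spark.sql.catalog.spark_catalog",
                    "org.apache.spark.sql.delta.catalog.DeltaCatalog")
            .getOrCreate();

        // Illustrative dataset standing in for the original tableDataset.
        Dataset<Row> tableDataset = spark.range(5).toDF();

        // Managed Delta table: omitting .option("path", ...) lets the
        // metastore own the table location, which avoided the
        // "is not a Delta table" AnalysisException described above.
        tableDataset.write().format("delta").saveAsTable("testData");

        spark.stop();
    }
}
```

If an external table at a specific location is genuinely needed, an alternative (again a sketch, not verified against the poster's environment) is to write the data to the path with `save(...)` and then register it via `CREATE TABLE ... USING DELTA LOCATION '...'`.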

0 Answers