I am new to Delta Lake and was trying a simple example:
- Create a DataFrame from a CSV file
- Save it as a Delta table
- Read it back again
It works fine, and I can see that the files are created in the default spark-warehouse folder.
But the next time I just want to read the saved table, so I comment out the code for the first two steps and re-run the program. Then I get:
AnalysisException: Table or view not found
Here is the full code:
import org.apache.spark.sql.SaveMode

// Step 1: create a DataFrame from the CSV
val transHistory = spark.read
  .option("header", "true")
  .option("inferSchema", "true")
  .csv(InputPath + "trainHistory.csv")

// Step 2: save it as a Delta table
transHistory.write.format("delta").mode(SaveMode.Overwrite).saveAsTable("transactionshistory")

// Step 3: read the saved table back
val transHistoryTable = spark.read.format("delta").table("transactionshistory")
transHistoryTable.show(10)
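
To be explicit, on the second run only the read remains; this is a minimal sketch of what I actually execute after commenting out steps 1 and 2:

// Second run: the CSV read and saveAsTable above are commented out,
// so only the read of the previously saved table runs
val transHistoryTable = spark.read.format("delta").table("transactionshistory") // throws AnalysisException: Table or view not found
transHistoryTable.show(10)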
I am using Delta Lake 0.8.0, Spark 3.0, and Scala 2.12.13.
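In case the session setup matters, this is roughly how I create the SparkSession (a sketch: the two Delta configs are the ones documented for Delta Lake 0.8.0 on Spark 3.0; the app name and master are placeholders for my local setup):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("DeltaExample") // placeholder app name
  .master("local[*]")      // assumption: running locally
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .getOrCreate()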