
I am running scala_script.scala from the Unix CLI and getting the error "encryption key missing", while the same Scala code runs fine in spark-shell.

The code reads a Hive table into a DataFrame, applies some transformations, and then writes it back with write.mode("append"/"overwrite").saveAsTable("my_db.my_table"). The code is:

import org.apache.spark.sql.hive.HiveContext
val hc = new HiveContext(sc)
val tb_applicant_details = hc.sql("SELECT * FROM staging_mps_25.applicant_details")
tb_applicant_details.write.mode("overwrite").insertInto("spark_tests.new_test_person")
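For reference, a minimal self-contained version of the script as it would be run non-interactively, assuming Spark 1.x where spark-shell predefines `sc`; the trailing `System.exit(0)` is an addition so that `spark-shell -i` terminates instead of dropping into the REPL after the script finishes:

```scala
// spark-shell predefines `sc` (SparkContext), so no SparkContext setup is needed here.
import org.apache.spark.sql.hive.HiveContext

val hc = new HiveContext(sc)

// Load the Hive table into a DataFrame.
val tb_applicant_details = hc.sql("SELECT * FROM staging_mps_25.applicant_details")

// insertInto matches columns by position, so the source and target schemas must line up.
tb_applicant_details.write.mode("overwrite").insertInto("spark_tests.new_test_person")

// Exit explicitly so `spark-shell -i` does not fall into the interactive prompt.
System.exit(0)
```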

1 Answer


A good approach is to use spark-shell's `:load path_to_script` from inside the shell. Another is to supply the correct driver jar and set `--master yarn` when running the script with `spark-shell -i`, for example: spark-shell --master yarn --jars /path/to/driver.jar -i /complete/absolute/path/to/script.scala
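Assuming the script from the question and a YARN cluster, the two approaches sketched above look like this (all paths and jar names are placeholders to adapt to your environment):

```shell
# Option 1: start spark-shell on YARN, then load the script from the REPL.
spark-shell --master yarn
# ...then at the scala> prompt:
#   :load /absolute/path/to/scala_script.scala

# Option 2: run non-interactively, shipping any extra driver jar the job needs.
spark-shell --master yarn \
  --jars /path/to/driver.jar \
  -i /absolute/path/to/scala_script.scala
```

Running on `--master yarn` matters here because spark-shell's default local master may not pick up the cluster-side Hive/security configuration (such as encryption settings), which is a plausible cause of the "encryption key missing" error appearing only outside the interactive shell.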