
I am trying to submit an application through Livy batches using a Postman POST call, and I see the following error in the logs. However, I am able to execute commands in interactive Livy sessions through curl.
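For reference, a batch submission equivalent to the Postman call might look like the following request sketch. The host, port, jar path, and class name are placeholders, not values from the original post:

```shell
# Hypothetical Livy batch submission; replace the host, port, jar
# path, and main class with your own values.
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{
        "file": "hdfs:///path/to/your-app.jar",
        "className": "com.example.YourMainClass"
      }' \
  http://livy-host:8998/batches
```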

I checked the HADOOP_CONF_DIR property in livy2-env and set it to the same value as in spark2-env.

Please suggest if I am missing anything.

"Diagnostics: File does not exist: hdfs://<host:port>/user/livy/.sparkStaging/application_TimeStamp/__spark_conf__.zip"

  • Livy needs to know where `${SPARK_HOME}/conf` is – OneCricketeer Feb 10 '18 at 00:40
  • Thank you for responding. I have checked the configuration and I see the following: SPARK_HOME=/usr/hdp/current/spark2-client SPARK_CONF_DIR=/etc/spark2/conf. The link in /etc/spark2/conf points to the Spark config. Please suggest. – pradz_stack Feb 11 '18 at 06:23
  • Not sure, really. I haven't run a version of HDP that supports Spark2 + Livy. You might want to try the Hortonworks support forums – OneCricketeer Feb 11 '18 at 18:40

1 Answer


You need to set HADOOP_CONF_DIR to the directory containing the Hadoop/HDFS configuration files. On my system the path is /usr/local/hadoop/etc/hadoop. Check the corresponding path on your system and set it accordingly; I hope that works.
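As a minimal sketch, this could be set in livy-env.sh (the Livy environment file mentioned in the question). The exact paths vary by installation; the HDP-style Spark paths below are taken from the question's comments, and the Hadoop path is the answerer's example:

```shell
# In livy-env.sh (paths are examples; adjust to your installation):
# directory holding core-site.xml, hdfs-site.xml, yarn-site.xml, etc.
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
export SPARK_HOME=/usr/hdp/current/spark2-client
export SPARK_CONF_DIR=/etc/spark2/conf
```

After changing the file, restart the Livy server so the new environment is picked up.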
