
I have some questions about YARN.

How can I run my jar file on YARN? When I try, I get this error:

Exception in thread "main" java.lang.Exception: When running with master 'yarn-cluster' either HADOOP_CONF_DIR or YARN_CONF_DIR must be set in the environment.

Should I copy Spark to every node of this supercomputer?

All nodes in this supercomputer are interconnected, and it has its own architecture.

Has anyone managed to run a Spark application on a supercomputer?

AHAD
  • what type of file system does your super computer have? local or shared? – Crackerman Jul 31 '15 at 17:13
  • it has shared file system. – AHAD Jul 31 '15 at 17:14
  • Then you only need to have it put in one place. If you ssh to the different nodes in the cluster, you should be able to see the hadoop installation. I believe your issue is that HADOOP_CONF_DIR needs to be set in the .bashrc of the user you are running the jar as. I am assuming that you are running some version of Linux. What is your OS? – Crackerman Jul 31 '15 at 17:16
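
Following the advice in the comments, a minimal sketch of what that setup might look like. This is a configuration fragment, not a tested recipe: the Hadoop config path and the application class name below are assumptions — substitute the actual locations on your cluster.

```shell
# Point Spark at the Hadoop/YARN client configuration.
# /etc/hadoop/conf is a common default, but the path is an
# assumption here -- use wherever your cluster's config lives.
# Adding these lines to ~/.bashrc makes them persist across
# SSH sessions, as suggested in the comment above.
export HADOOP_CONF_DIR=/etc/hadoop/conf
export YARN_CONF_DIR=/etc/hadoop/conf

# Submit the jar to YARN in cluster mode.
# 'yarn-cluster' was the master string in Spark 1.x; on Spark 2+
# the equivalent is '--master yarn --deploy-mode cluster'.
# com.example.MyApp and the jar path are placeholders.
spark-submit \
  --master yarn-cluster \
  --class com.example.MyApp \
  /path/to/my-app.jar
```

With a shared file system, Spark itself only needs to be installed in one place visible to all nodes; what the driver machine needs is the environment variables above so it can find the YARN ResourceManager.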

0 Answers