I'm trying to run a Spark application in standalone mode. After installing Spark, when I ran the spark-submit command I got the error above: no java file found. I tried three different approaches:
- approach-1: I was able to remove the additional '/' by changing the environment file, but the issue still persists.
- approach-2: I made the files containing JAVA_HOME consistent, but I was unable to find the spark.conf file where I could make it consistent as well.
- approach-3: I tried changing my bash profile, but with no result either.
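As a sanity check for the approaches above, this is a small sketch (the trailing-slash path is a hypothetical example, not my real config) showing how a stray '/' can be stripped from JAVA_HOME with shell parameter expansion, and how to confirm the java binary actually exists at that path:

```shell
# Hypothetical JAVA_HOME that picked up a trailing slash from some config file
JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_192.jdk/Contents/Home/"

# ${VAR%/} strips a single trailing slash, if present
JAVA_HOME="${JAVA_HOME%/}"
echo "$JAVA_HOME"

# Verify the java executable is really there and runnable
if [ -x "$JAVA_HOME/bin/java" ]; then
    echo "java found"
else
    echo "java missing at $JAVA_HOME/bin/java"
fi
```

If this prints "java missing", the path itself is wrong and no amount of config shuffling will help until it is corrected.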
Below is my .bash_profile:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_192.jdk/Contents/Home
export SPARK_HOME=/Users/xxxx/server/spark-2.3.0-bin-hadoop2.7
export SBT_HOME=/Users/xxxx/server/sbt
export SCALA_HOME=/Users/xxxx/server/scala-2.11.12
export PATH=$JAVA_HOME/bin:$SBT_HOME/bin:$SBT_HOME/lib:$SCALA_HOME/bin:$SCALA_HOME/lib:$PATH
export PATH=$JAVA_HOME/bin:$SPARK_HOME:$SPARK_HOME/bin:$SPARK_HOME/sbin:$PATH
export PYSPARK_PYTHON=python3
Here is my /etc/environment file:
JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_192.jdk/Contents/Home"
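One place I have not shown above is Spark's own config: Spark sources conf/spark-env.sh on startup, and JAVA_HOME can be set there so spark-submit uses the same JDK regardless of shell environment. A sketch of what I believe that setup looks like, reusing the paths from my profile (this is a config fragment, not something I have verified fixes the error):

```shell
# spark-env.sh is not present by default; it is created from the shipped template
cd /Users/xxxx/server/spark-2.3.0-bin-hadoop2.7/conf
cp spark-env.sh.template spark-env.sh

# Append JAVA_HOME so Spark scripts and daemons all resolve the same JDK
echo 'export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_192.jdk/Contents/Home' >> spark-env.sh
```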
Could anyone help me resolve this issue? I have been trying to run a Spark Scala application on my Mac for two days.