I am running Spark on my local Windows machine. It works perfectly fine when I set the master as local, but when I give it a cluster master URI, it throws the following exception for every executor it initiates:
17/10/05 17:27:19 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20171005172719-0006/0 is now FAILED (java.lang.IllegalStateException: Library directory 'C:\Spark\bin\.\assembly\target\scala-2.10\jars' does not exist; make sure Spark is built.)
I was trying out a Spark standalone environment locally, so I started a master node and a worker node and gave the master URL to my driver program. I made sure my SPARK_HOME environment variable is set to C:\Spark (the location where I placed Spark).
Any help in solving this issue would be appreciated. Thanks.
- Can you share the exact spark-submit command with arguments? – Mahendra Singh Meena Oct 05 '17 at 13:38
- spark-submit --class SparkApp --master spark://{IP}:7077 "D:\work\Examples\SparkExample\target\SparkExample-0.0.1-SNAPSHOT.jar" – Rakesh Oct 05 '17 at 13:42
- Please add your code and spark-submit command – vaquar khan Oct 05 '17 at 15:04
1 Answer
I somehow managed to find the fix for this problem. The issue was caused by the PATH variable: Windows did not pick up the SPARK_HOME environment variable when I added %SPARK_HOME%\bin to PATH. I removed both the environment variable and the PATH entry, added them again, and restarted my system. It worked.
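For anyone hitting the same error: it means the executor resolved SPARK_HOME to the wrong place and could not find Spark's jars there. A packaged Spark release keeps them in `SPARK_HOME\jars`, while a source checkout keeps them under `assembly\target\scala-*\jars` (which is the path in the exception above). A small diagnostic sketch, assuming nothing beyond the standard library (`find_spark_jars` is a hypothetical helper, not part of Spark), that mimics this lookup so you can verify your SPARK_HOME before submitting:

```python
import os

def find_spark_jars(spark_home):
    """Return the jars directory a Spark launcher would look for under
    spark_home, or None if neither known layout exists."""
    # Packaged releases ship jars directly under SPARK_HOME/jars.
    candidates = [os.path.join(spark_home, "jars")]
    # Source builds place them under assembly/target/scala-<version>/jars.
    assembly = os.path.join(spark_home, "assembly", "target")
    if os.path.isdir(assembly):
        for entry in sorted(os.listdir(assembly)):
            if entry.startswith("scala-"):
                candidates.append(os.path.join(assembly, entry, "jars"))
    for path in candidates:
        if os.path.isdir(path):
            return path
    return None

if __name__ == "__main__":
    home = os.environ.get("SPARK_HOME")
    if not home:
        print("SPARK_HOME is not set")
    else:
        jars = find_spark_jars(home)
        print("jars found at:", jars) if jars else print(
            "no jars under", home, "- check SPARK_HOME / PATH")
```

If this prints `no jars under ...` for the SPARK_HOME you expect, the executors will fail the same way, which is exactly the symptom the restart-after-re-adding-the-variables fix resolved here.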

Rakesh