
I am running Spark on my local Windows machine. It works perfectly fine when I set the master to local, but when I give it a cluster master URI, it throws the following exception for every executor it launches:
17/10/05 17:27:19 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20171005172719-0006/0 is now FAILED (java.lang.IllegalStateException: Library directory 'C:\Spark\bin\.\assembly\target\scala-2.10\jars' does not exist; make sure Spark is built.)

I was trying out a Spark standalone environment locally, so I started a master node and a worker node and gave the master URL to my driver program. I made sure my SPARK_HOME environment variable is set to C:\Spark (the location where I placed Spark).
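
For reference, the driver side amounts to roughly the following; this is a minimal sketch (the SparkConf/SparkContext API and the app name are assumptions, and the master host is a placeholder, not my actual setup):

    import org.apache.spark.{SparkConf, SparkContext}

    // Point the driver at the standalone master instead of local mode.
    // "spark://<master-host>:7077" is a placeholder for the real master URI
    // shown on the master's web UI.
    val conf = new SparkConf()
      .setAppName("StandaloneTest") // hypothetical app name
      .setMaster("spark://<master-host>:7077")
    val sc = new SparkContext(conf)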

Any help in solving this issue would be appreciated. Thanks.


1 Answer


I somehow managed to find the fix for this problem. The issue was caused by the PATH entry for Spark: Windows did not pick up the SPARK_HOME environment variable when I added %SPARK_HOME%\bin to PATH. I removed both the environment variable and the PATH entry, added them again, and restarted my system. It worked.
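
As a quick way to confirm the variables actually resolve after re-adding them, something like this can be run from a Scala shell. It is a hypothetical sanity check, not part of the original fix, and the jars layout it probes assumes a pre-built Spark 2.x distribution:

    import java.io.File

    // Hypothetical check: confirm SPARK_HOME is visible to the JVM and that
    // the distribution's jars directory exists where Spark expects it.
    val sparkHome = sys.env.getOrElse("SPARK_HOME", sys.error("SPARK_HOME is not set"))
    val jarsDir = new File(sparkHome, "jars")
    println(s"SPARK_HOME = $sparkHome")
    println(s"jars directory present: ${jarsDir.exists}")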
