I am very frustrated with Spark. I've wasted an evening thinking I was doing something wrong, but I have uninstalled and reinstalled several times, following multiple guides that all describe very similar steps.
At the cmd prompt, I am trying to run:
pyspark
or
spark-shell
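Concretely, this is all I'm typing at the prompt (run from the bin folder of the unpacked package; the path is just an example from my machine):

    REM run from the bin folder of the unpacked package (example path)
    cd C:\Spark\spark-2.1.0-bin-hadoop2.7\bin
    pyspark
    REM or, for the Scala shell
    spark-shell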
The steps I followed include downloading a pre-built package from:
https://spark.apache.org/downloads.html
specifically Spark 2.0.2 pre-built for Hadoop 2.3 and Spark 2.1.0 pre-built for Hadoop 2.7.
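As far as I can tell the download itself isn't the issue; the unpacked folder does contain the jars directory that the error below complains about. A quick check (example path):

    REM example path; listing the folders the error message refers to
    cd C:\Spark\spark-2.1.0-bin-hadoop2.7
    dir bin
    dir jars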
Neither works, and I get this error:
'Files\Spark\bin\..\jars""\' is not recognized as an internal or external command,
operable program or batch file.
Failed to find Spark jars directory.
You need to build Spark before running this program.
I've set up my environment variables correctly as well, and used the winutils.exe trick, but these seem unrelated to the problem at hand.
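For reference, this is roughly what I set. It's only a cmd-level sketch of the equivalent setx commands, and the paths are examples from my machine:

    REM example values only; adjust to wherever the package was unpacked
    setx SPARK_HOME "C:\Spark\spark-2.1.0-bin-hadoop2.7"
    setx HADOOP_HOME "C:\Hadoop"
    REM winutils.exe sits in C:\Hadoop\bin, per the usual winutils trick
    setx PATH "%PATH%;C:\Spark\spark-2.1.0-bin-hadoop2.7\bin;C:\Hadoop\bin"
    REM some guides also suggest: winutils.exe chmod -R 777 C:\tmp\hive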
I can't be the only one who's stuck on this problem. Does anyone know a workaround for getting Spark to run on Windows?