I'm trying to connect R to Spark following the sparklyr
tutorial from RStudio: http://spark.rstudio.com/
Somehow, I'm getting the error message below. Does anyone know how to solve this? Thanks for your help.
I have tried adding C:\Windows\system32 to the System Variables Path, without any success.
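For completeness, this is roughly how I double-checked the PATH from inside R before retrying (the Sys.setenv() call is just my own session-level attempt, not something the tutorial asks for):

# Check whether C:\Windows\system32 is visible to the R session
Sys.getenv("PATH")

# Append it for the current session only (guesswork on my part)
Sys.setenv(PATH = paste(Sys.getenv("PATH"), "C:\\Windows\\system32", sep = ";"))

The connection attempt still fails in the same way: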
> library(sparklyr)
> sc <- spark_connect(master = "local")
Error in sparkapi::start_shell(master = master, spark_home = spark_home, :
Failed to launch Spark shell. Ports file does not exist.
Path: C:\Users\Gaud\AppData\Local\rstudio\spark\Cache\spark-1.6.1-bin-hadoop2.6\bin\spark-submit.cmd
Parameters: --jars, "C:\Users\Gaud\Documents\R\win-library\3.3\sparklyr\java\sparklyr.jar", --packages, "com.databricks:spark-csv_2.11:1.3.0","com.amazonaws:aws-java-sdk-pom:1.10.34", sparkr-shell, C:\Users\Gaud\AppData\Local\Temp\RtmpC8MAa8\file322c47ee2a28.out
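In case it matters, the local Spark installation itself was done earlier through sparklyr, as the tutorial describes; I assume that is where the cached spark-1.6.1-bin-hadoop2.6 folder in the error above comes from (the version number below is inferred from that path):

library(sparklyr)

# Local Spark install done before the connection attempt, per the tutorial
spark_install(version = "1.6.1")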