
I'm currently testing an application using SparkR. These are my platform and application details:

Platform: Windows Server 2008
R version: 3.1.2 (2014-10-31)
Spark version: 1.4.1

What I did:

Step I: Load package into R environment

library(SparkR)  # working

Step II: Set the system environment variables

Sys.setenv(SPARK_HOME = "C:\\hdp\\spark-1.4.1-bin-hadoop2.6")  # working (backslashes must be doubled in R strings)
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

Step III: Create a spark context and a SQL context

sc <- sparkR.init(master = "local", sparkHome = "C:\\hdp\\spark-1.4.1-bin-hadoop2.6", appName = "TestSparR")

I get an error at this line: "JVM is not ready after 10 seconds".
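(One quick check before calling sparkR.init(): make sure SPARK_HOME actually resolves to a real Spark installation, since a mistyped or wrongly escaped Windows path makes the launcher unfindable and produces exactly this timeout. A minimal sketch, reusing this question's path:)

# Use doubled backslashes (or forward slashes) in Windows paths
Sys.setenv(SPARK_HOME = "C:\\hdp\\spark-1.4.1-bin-hadoop2.6")

# On Windows the launcher is spark-submit.cmd; this should print TRUE
file.exists(file.path(Sys.getenv("SPARK_HOME"), "bin", "spark-submit.cmd"))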

Please help me resolve this issue. Thanks.

Vijay_Shinde

2 Answers


I had the same problem, and I can tell you I tried many, many things.

But finally the following worked for me, after restarting my computer (and R and RStudio by the way):

# Path to the Spark installation
SPARK_HOME <- "C:\\Apache\\spark-1.5.2-bin-hadoop2.6\\"

# Extra arguments passed to spark-submit (pulls in the spark-csv package)
Sys.setenv('SPARKR_SUBMIT_ARGS' = '"--packages" "com.databricks:spark-csv_2.10:1.2.0" "sparkr-shell"')

# Load SparkR from the copy shipped with the Spark installation
library(SparkR, lib.loc = "C:\\Apache\\spark-1.5.2-bin-hadoop2.6\\R\\lib")
library(SparkR)
library(rJava)

# Create the Spark context
sc <- sparkR.init(master = "local", sparkHome = SPARK_HOME)
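Since the SPARKR_SUBMIT_ARGS above pulls in the spark-csv package, a typical follow-up is reading a CSV through it. A minimal sketch for the SparkR 1.x API; the file path is hypothetical:

# Create a SQL context on top of the Spark context
sqlContext <- sparkRSQL.init(sc)

# Read a CSV file through the spark-csv data source (hypothetical path)
df <- read.df(sqlContext, "C:/data/example.csv",
              source = "com.databricks.spark.csv", header = "true")
head(df)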

Maybe this helps too: after restarting the system, this entry had been added to my PATH environment variable:

C:\ProgramData\Oracle\Java\javapath
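From R you can verify that the JVM is actually reachable through that PATH. A minimal sketch; nothing here is specific to this particular setup:

# Inspect the PATH that the R session sees (Windows uses ";" as a separator)
strsplit(Sys.getenv("PATH"), ";")[[1]]

# Check that the java launcher can be found and runs
Sys.which("java")        # should return a non-empty path
system("java -version")  # should print the installed JVM version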

quim

This worked for me:

# Point SPARK_HOME at your Spark installation
sparkPath <- 'C:/Users/YOUR PATH'
Sys.setenv(SPARK_HOME = sparkPath)

# Make the SparkR package shipped with Spark visible to this R session
.libPaths(c(file.path(Sys.getenv('SPARK_HOME'), 'R', 'lib'), .libPaths()))

library(SparkR)
library(sparklyr)

# Connect via sparklyr instead of sparkR.init()
sc <- spark_connect(master = 'local')
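If the connection succeeds, a short smoke test on the sparklyr connection above (a minimal sketch; iris is just an example dataset):

# Confirm which Spark version the connection is talking to
spark_version(sc)

# Copy a small local data frame into Spark and read the first rows back
iris_tbl <- copy_to(sc, iris, overwrite = TRUE)
head(iris_tbl)

# Close the connection when done
spark_disconnect(sc)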

rad15f
  • If this doesn't work, it may be because you have a space in your path, e.g. C:/Users/Rick Ross/etc.; paths containing spaces are sometimes glitchy. Try installing Docker! – rad15f Jan 25 '22 at 15:03