
I have installed the SparkR package, and I am able to run other computation jobs such as estimating pi or counting words in a document. But when I try to initiate a SparkR SQL job, it gives an error. Can anyone help me out? I am using R version 3.2.0 and Spark 1.3.1.

```
> library(SparkR)
> sc1 <- sparkR.init(master="local")
Launching java with command  /usr/lib/jvm/java-7-oracle/bin/java   -Xmx1g -cp '/home/himaanshu/R/x86_64-pc-linux-gnu-library/3.2/SparkR/sparkr-assembly-0.1.jar:' edu.berkeley.cs.amplab.sparkr.SparkRBackend /tmp/Rtmp0tAX4W/backend_port614e1c1c38f6 
15/07/09 18:05:51 WARN Utils: Your hostname, himaanshu-Inspiron-5520 resolves to a loopback address: 127.0.0.1; using 172.17.42.1 instead (on interface docker0)
15/07/09 18:05:51 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/07/09 18:05:52 INFO Slf4jLogger: Slf4jLogger started
15/07/09 18:05:54 WARN SparkContext: Using SPARK_MEM to set amount of memory to use per executor process is deprecated, please use spark.executor.memory instead.
> sqlContext <- sparkRSQL.init(sc1)
Error: could not find function "sparkRSQL.init"
```
user459

1 Answer


Your SparkR version is wrong: sparkr-assembly-0.1.jar is the old standalone AMPLab SparkR package (note the edu.berkeley.cs.amplab.sparkr classes in your launch log), and it does not contain sparkRSQL.init yet. That function is part of the SparkR that ships with Apache Spark itself (version 1.4.0 and later), so you need to load that build of SparkR instead.
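
For illustration, here is a minimal sketch of initializing the SQL context with the SparkR bundled inside a Spark 1.4+ installation. The `/path/to/spark-1.4.0` path is a hypothetical placeholder; substitute wherever Spark is actually installed on your machine:

```r
# Point R at the SparkR library that ships with Spark 1.4+
# ("/path/to/spark-1.4.0" is a placeholder for your actual install path).
Sys.setenv(SPARK_HOME = "/path/to/spark-1.4.0")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

library(SparkR)

# Create the Spark context, then the SQL context on top of it
sc <- sparkR.init(master = "local")
sqlContext <- sparkRSQL.init(sc)

# Quick sanity check: turn a local data.frame into a Spark DataFrame
df <- createDataFrame(sqlContext, faithful)
head(df)
```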

coladad