
I have Spark 1.6.2 installed on my system. I am also using R (3.4.0) with rstudio-server 1.0.143 on a CentOS 6.9 machine.

Whenever I run the command

library(sparklyr)
sc <- spark_connect(master = "local")

it fails with the following error:

Error in spark_version_from_home(spark_home, default = spark_version) : Failed to detect version from SPARK_HOME or SPARK_HOME_VERSION. Try passing the spark version explicitly.
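For reference, the error suggests either setting SPARK_HOME or passing the Spark version explicitly to spark_connect. A minimal sketch of what that could look like is below; the path /opt/spark-1.6.2 is only an assumed install location and should be replaced with wherever Spark 1.6.2 is actually installed.

library(sparklyr)

# Assumed install location -- replace with the actual Spark 1.6.2 directory
Sys.setenv(SPARK_HOME = "/opt/spark-1.6.2")

# Pass the home directory and version explicitly, as the error message suggests
sc <- spark_connect(master = "local",
                    spark_home = Sys.getenv("SPARK_HOME"),
                    version = "1.6.2")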

  • Please read [Under what circumstances may I add “urgent” or other similar phrases to my question, in order to obtain faster answers?](//meta.stackoverflow.com/q/326569) - the summary is that this is not an ideal way to address volunteers, and is probably counterproductive to obtaining answers. Please refrain from adding this to your questions. – halfer Jun 30 '17 at 12:49
  • 1
  • You'll need to specify your SPARK_HOME manually; sparklyr sometimes has trouble detecting it from the system. – eliasah Jun 30 '17 at 14:31
  • @halfer I am meeting a deadline, which is why I added the phrase 'as soon as possible'. I have full respect for this forum, but I personally do not think that using these phrases should be done away with. Everyone needs answers in the shortest possible time, and so do I. Anyway, I will take care of this next time. Also, thank you for informing me about this. :) – Pulkit Joshi Jul 01 '17 at 05:22

0 Answers