21

I have installed the following setup: Hadoop 1.0.3, Java 1.7.0_67, Scala 2.11.7, and Spark 2.1.1.

I am getting the error below; can anyone help me with this?

root@sparkmaster:/home/user# spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/07/05 01:07:35 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/07/05 01:07:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/05 01:07:37 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.

17/07/05 01:07:37 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing 


<console>:14: error: not found: value spark
       import spark.implicits._

<console>:14: error: not found: value spark
       import spark.sql


Using Scala version 2.11.8 (Java HotSpot(TM) Client VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 
Pankaj Kumar
    Can you please upgrade to Java 8 to get rid of _"WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0"_ and start over? – Jacek Laskowski Jul 05 '17 at 04:03

5 Answers

62

There are a few different solutions

  1. Get your hostname

    $ hostname
    

    then try to assign your hostname:

    $ sudo hostname -s 127.0.0.1
    

    Start spark-shell.

  2. Add your hostname to your /etc/hosts file if it is not already present (the quick check after this list shows what your hostname currently resolves to)

    127.0.0.1      your_hostname
    
  3. Add the environment variable in load-spark-env.sh:

    export SPARK_LOCAL_IP="127.0.0.1"
    
  4. The steps above solved my problem, but you can also try adding

    export SPARK_LOCAL_IP=127.0.0.1

    under the comment for the local IP in the template file spark-env.sh.template (/usr/local/Cellar/apache-spark/2.1.0/libexec/conf/)

    and then

    cp spark-env.sh.template spark-env.sh
    spark-shell
    
  5. If none of the above works, check your firewall settings and enable the firewall if it is not already enabled
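
As a quick way to see what steps 1 and 2 are fixing, you can check what your hostname currently resolves to from any Scala REPL (including spark-shell once it starts). This is only a diagnostic sketch using the standard JDK lookup, which is essentially the resolution Spark relies on when it picks an address for the driver:

import java.net.InetAddress

// Print the local hostname and the address it resolves to; if this lookup fails
// or returns an address that cannot be bound, the 'sparkDriver' service will hit
// the same BindException.
val localHost = InetAddress.getLocalHost
println(s"${localHost.getHostName} -> ${localHost.getHostAddress}")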

Alper t. Turker
    Step 4 helped to fix my issue. Thanks! – Arun Jul 08 '19 at 13:31
  • Step 3 worked for me! – dgoodman1 Dec 22 '20 at 14:49
  • step 2 worked for me. – Rahul Goyal Oct 19 '21 at 06:42
  • Step 1 worked for me! In my case it was Cisco AnyConnect preventing host binding with this error: ERROR SparkContext: Error initializing SparkContext. java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address. at java.base/sun.nio.ch.Net.bind0(Native Method) – The Code Guy Mar 29 '22 at 17:27
20

Add SPARK_LOCAL_IP in load-spark-env.sh as

export SPARK_LOCAL_IP="127.0.0.1"

The load-spark-env.sh file is located in the spark/bin directory

Or you can add your hostname to the /etc/hosts file as

127.0.0.1   hostname 

You can get your hostname by typing hostname in the terminal
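
Once spark-shell starts successfully, you can confirm from the REPL which address the driver actually bound to (sc is the SparkContext that spark-shell creates); a small check along these lines:

// In spark-shell: print the bind address (if it was set explicitly) and the driver host.
sc.getConf.getOption("spark.driver.bindAddress").foreach(addr => println(s"bindAddress = $addr"))
println(s"driver host = ${sc.getConf.get("spark.driver.host")}")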

Hope this solves the issue!

koiralo
10
  • Had a similar issue in IntelliJ

    Reason: I was on the Cisco AnyConnect VPN

    Fix: after disconnecting from the VPN, the issue did not appear

sonu1986
    I'm also on Intellij and VPN... however I do need my VPN. I tried setting *Allow local (LAN) access when using my VPN (if configured)*. And it worked. – Dici Feb 16 '19 at 11:02
  • Unfortunately, I am using PulseSecure which does not support this option for Allow local (LAN). So, have to disable VPN when working on Intellij IDE/PyCharm for SparkSession to work. – vmorusu Sep 25 '19 at 21:11
    I had this issue with Cisco AnyConnect VPN but explicitly setting my hostname to 127.0.0.1 in /etc/hosts fixed the issue. – Tegan Snyder Feb 28 '20 at 03:27
  • I was wondering why sometimes it worked and sometimes it didn't, then I noticed that I was on VPN. Thank you, you saved much of my time – Christian Vincenzo Traina Jun 26 '20 at 10:29
2
  1. In your terminal, type hostname to see your current hostname.
  2. Open /etc/hosts with vim and map the hostname you just found to your actual IP or to 127.0.0.1.
linxx
0

Whenever you switch networks while working with Spark, this can happen because the driver tries to bind to an address from the previous network.

As a quick fix, set "spark.driver.bindAddress" to "localhost" or "127.0.0.1" in your Spark application:

import org.apache.spark.{SparkConf, SparkContext}

// Create a SparkContext using every core of the local machine,
// binding the driver explicitly to localhost
val confSpark = new SparkConf().set("spark.driver.bindAddress", "localhost")
val sc = new SparkContext("local[*]", "appname", confSpark)
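
Since the question runs Spark 2.1.1, where spark-shell builds a SparkSession rather than a bare SparkContext, the same setting can also be applied through the SparkSession builder; a minimal sketch (the application name is just illustrative):

import org.apache.spark.sql.SparkSession

// Spark 2.x style: bind the driver to the loopback address via the session builder.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("local-app")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()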
Devbrat Shukla