93

I tried to start Spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS Yosemite 10.10.5 using

./bin/spark-shell

It fails with the error below. I also tried installing different versions of Spark, but they all produce the same error. This is the second time I'm running Spark; my previous run worked fine.

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:444)
    at sun.nio.ch.Net.bind(Net.java:436)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)

java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:15)
    at $iwC.<init>(<console>:24)
    at <init>(<console>:26)
    at .<init>(<console>:30)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:16: error: not found: value sqlContext
         import sqlContext.implicits._
                ^
<console>:16: error: not found: value sqlContext
         import sqlContext.sql

Then I added

export SPARK_LOCAL_IP="127.0.0.1"

to spark-env.sh, and the error changed to:

 ERROR : No route to host
    java.net.ConnectException: No route to host
        at java.net.Inet6AddressImpl.isReachable0(Native Method)
        at java.net.Inet6AddressImpl.isReachable(Inet6AddressImpl.java:77)
        at java.net.InetAddress.isReachable(InetAddress.java:475)
...
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
Jia
  • On my working machine 10.10.5 - I get a (different) error as well if I export that variable, so I don't think it's a fix anyway. Without it it runs fine. Have you tried running at the DEBUG log level to get more info? – jbrown Jan 05 '16 at 12:07
    This may because Spark doesn't work well with ipv6 right now. Could you try `bin/spark-shell --driver-java-options "-Djava.net.preferIPv4Stack=true"` – zsxwing Jan 05 '16 at 18:20
  • @zsxwing I tried and it has similar error`16/01/05 10:29:39 ERROR : No route to host 16/01/05 10:29:39 ERROR : No route to host java.net.ConnectException: No route to host at java.net.Inet4AddressImpl.isReachable0(Native Method) at java.net.Inet4AddressImpl.isReachable(Inet4AddressImpl.java:70) at java.net.InetAddress.isReachable(InetAddress.java:475) at java.net.InetAddress.isReachable(InetAddress.java:434) at tachyon.util.NetworkUtils.getLocalIpAddress(NetworkUtils.java:122) at tachyon.util.NetworkUtils.getLocalHostName(NetworkUtils.java:78)...` – Jia Jan 05 '16 at 18:31
  • Maybe some network configuration issue in your machine. Could you try `ping 127.0.0.1`? – zsxwing Jan 05 '16 at 18:47
  • @zsxwing I tried ping 127.0.0.1, and it works ok. – Jia Jan 08 '16 at 19:52
  • Do you have the full stack trace of `16/01/05 10:29:39 ERROR : No route to host 16/01/05 10:29:39 ERROR : No route to host java.net.ConnectException: No route to host at java.net.Inet4AddressImpl.isReachable0(Native Method)`? – zsxwing Jan 08 '16 at 20:13

13 Answers

148

The following steps might help:

  1. Get your hostname using the "hostname" command.

  2. If it is not already present, add an entry for your hostname to the /etc/hosts file as follows:

     127.0.0.1      your_hostname
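
Why this helps: Spark derives an address for the 'sparkDriver' service from the machine's hostname, and if that name does not resolve you get the repeated bind failures shown in the question. A quick, illustrative way to check the resolution from the JVM (a standalone sketch, not part of Spark):

import java.net.{InetAddress, UnknownHostException}

// Prints what the JVM resolves the local hostname to.
// An UnknownHostException here means the /etc/hosts entry above is missing.
object HostnameCheck extends App {
  try {
    val local = InetAddress.getLocalHost
    println(s"hostname = ${local.getHostName}, address = ${local.getHostAddress}")
  } catch {
    case e: UnknownHostException =>
      println(s"Hostname does not resolve: ${e.getMessage}")
  }
}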
    
Gaurav Sharma
129

I always get that when switching between networks. This solves it:

$ sudo hostname -s 127.0.0.1

Ardavan
35

I built Spark from the current master branch (version 2.0.0-SNAPSHOT). After adding export SPARK_LOCAL_IP="127.0.0.1" to load-spark-env.sh, it worked for me. I'm using macOS 10.10.5, so it could be a version issue?

meltac
    I was able to use `$ SPARK_LOCAL_IP="127.0.0.1" gradle my-gradle-task-using-local-spark`. The problem came up when using vpn. I'm using Macos 10.11.1. – Sergey Orshanskiy Oct 31 '16 at 21:05
33

Just set spark.driver.host to localhost if you are using an IDE:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("AnyName").set("spark.driver.host", "localhost");
JavaSparkContext sc = new JavaSparkContext(conf);
Mohamed Ahmed
  • what does local[2] mean? I just set the master to 'local' along with setting the spark.driver.host prop, and it fixed my issue. – medloh Apr 05 '17 at 16:11
  • Nice idea to add `spark.driver.host` directly in the code as opposed to messing with the `$SPARK_HOME/conf/spark-defaults.conf` – WestCoastProjects Nov 22 '17 at 17:06
  • This is the exact solution I'm looking for. It worked. Why should it be only if we are using IDE? will this have impact for non IDE tests? – Buddha Apr 17 '19 at 20:42
9

I think there are two errors:

  1. Your Spark local IP was not correct and needs to be changed to 127.0.0.1.
  2. You didn't define sqlContext properly.

For 1, I tried:

  • 1) exporting SPARK_LOCAL_IP="127.0.0.1" in ~/.bash_profile
  • 2) adding export SPARK_LOCAL_IP="127.0.0.1" to load-spark-env.sh under $SPARK_HOME

Neither worked. Then I tried the following, and it worked:

val conf = new SparkConf().
    setAppName("SparkExample").
    setMaster("local[*]").
    set("spark.driver.bindAddress","127.0.0.1")
val sc = new SparkContext(conf)

For 2, you can try:

val sqlContext = SparkSession.builder.config("spark.master","local[*]").getOrCreate()

and then import sqlContext.implicits._

The SparkSession builder will automatically reuse the SparkContext if one already exists; otherwise it will create one. You can also create the two explicitly if necessary.
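
For reference, a minimal combined sketch of the two fixes with the Spark 2.x SparkSession API (the app name is illustrative, assuming a local run):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SparkExample")                           // illustrative name
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")   // avoids the 'sparkDriver' bind failure
  .getOrCreate()

import spark.implicits._   // enables toDF/toDS conversions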

Rong Du
9
export SPARK_LOCAL_IP=127.0.0.1

Put this in your ~/.bash_profile on a Mac.

alturium
8

If you don't want to change the hostname of your Mac, you can do the following:

  1. Find the template file spark-env.sh.template on your machine (it is probably in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/).
  2. cp spark-env.sh.template spark-env.sh
  3. Add export SPARK_LOCAL_IP=127.0.0.1 under the comment for local IP.

Start spark-shell and enjoy it.

Dvin
7

If you are using Scala to run the code in an IDE and you face the same issue, but are using SparkSession() instead of SparkConf() as pointed out above, you can bind the localhost address as follows. set() only works on SparkConf(), so with SparkSession() use .config() to set the Spark configuration, as shown below:

    val spark = SparkSession
       .builder()
       .appName("CSE512-Phase1")
       .master("local[*]").config("spark.driver.bindAddress", "localhost")
       .getOrCreate()
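
A quick, illustrative check that the session came up once the driver address binds correctly:

println(spark.range(5).count())   // assumes `spark` was built as above; should print 5
spark.stop()
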
3

This happens when you switch between different networks (VPN - PROD, CI, or other company networks used to access different environments).

I had the same issue whenever I switched the VPN.

Update /etc/hosts (with sudo) with the hostname value of your Mac.

Yoga Gowda
2

Sometimes a firewall prevents creating and binding a socket. Make sure your firewall is not enabled, and also check that the IP of your machine in /etc/hosts is correct, then try again:

sudo ufw disable
2
JavaSparkContext sparkContext = new JavaSparkContext("local[4]", "Appname");

export SPARK_LOCAL_IP=127.0.0.1

Just doing the above worked for me.

OneCricketeer
0

On a Mac, check your IP in System Preferences -> Network -> click the Wi-Fi network you are connected to (it should show a green icon) -> check the IP just above your network name.

Make the following entries in ../conf/spark-env.sh:

SPARK_MASTER_HOST=<<your-ip>>
SPARK_LOCAL_IP=<<your-ip>>

and then try spark-shell. Making the above changes worked for me.

0

This happens whenever you switch networks while working with Spark. For an instant fix, set "spark.driver.bindAddress" to "localhost" or "127.0.0.1" in your Spark application:

// Create a SparkContext using every core of the local machine
val confSpark = new SparkConf().set("spark.driver.bindAddress", "localhost")
val sc = new SparkContext("local[*]", "appname", conf = confSpark)
Devbrat Shukla