
I have a problem running Spark from Scala IDE. The code below

val conf = new SparkConf().setAppName("xxx").setMaster("local[*]")

runs fine in Scala IDE, but

val conf = new SparkConf().setAppName("xxx").setMaster("spark://ipOfMyPC:7077")

does not work. The error message is:

WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

I checked with Spark-Shell: its Web UI uses port 4040 and works fine. That is how I found that the executor did not start.

The SparkUI of the Scala IDE job automatically uses port 4041, and there I found that no executor started; only the driver exists. I tried the code below, but it did not work either:

val conf = new SparkConf().setAppName("xxx").set("spark.executor.instances", "2").set("spark.executor.memory", "1g").setMaster("spark://ipOfMyPC:7077")
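For reference, here is the same configuration spread out, with one caveat: spark.executor.instances is a YARN setting and is ignored by a standalone master, which may be why setting it had no effect. On a standalone cluster, executor resources are bounded with spark.cores.max and spark.executor.memory instead. This is only a sketch; ipOfMyPC and the resource values are placeholders to adapt to your cluster:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: "ipOfMyPC" must match the master URL exactly as shown on the
// master web UI (port 8080). On a standalone master, spark.executor.instances
// is ignored (it applies to YARN); use spark.cores.max / spark.executor.memory.
val conf = new SparkConf()
  .setAppName("xxx")
  .setMaster("spark://ipOfMyPC:7077")
  .set("spark.cores.max", "2")        // total cores this app may claim
  .set("spark.executor.memory", "1g") // must fit within a worker's free memory

val sc = new SparkContext(conf)
```

If spark.executor.memory is larger than any worker's available memory, the standalone master cannot place an executor and you get exactly the "Initial job has not accepted any resources" warning from the question.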

How can I solve this issue in Scala IDE?

My platform is Windows 8.1, and the firewall is disabled. Thank you very much.

chunhunghan

2 Answers


Go to the Spark master web UI at yourip:8080.

Find the master URL shown there, something like spark://xyz:7077.

Use exactly that string as the master URL in setMaster.

Use the start-all.sh command to start the master and all worker nodes.

Nilesh

Even though the SparkUI of the Scala IDE job automatically uses port 4041 instead of 4040, the running Spark-Shell (on port 4040) turned out to be the problem: after stopping the Spark-Shell, the Scala IDE job ran successfully.
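This behavior is consistent with how a standalone cluster schedules resources: an application with no spark.cores.max set (such as a default Spark-Shell) grabs all available cores, so a second application's executors cannot be scheduled until the first one releases them. The UI port clash itself is harmless, but it can also be avoided explicitly. A sketch, reusing the question's placeholder master URL; the port number 4050 is an arbitrary choice:

```scala
import org.apache.spark.SparkConf

// Sketch: spark.ui.port only moves the driver's web UI; the executors
// appeared because stopping the Spark-Shell released the cluster's cores,
// not because of the port change. Capping cores lets two apps coexist.
val conf = new SparkConf()
  .setAppName("xxx")
  .setMaster("spark://ipOfMyPC:7077")
  .set("spark.ui.port", "4050")  // avoid clashing with a Spark-Shell on 4040
  .set("spark.cores.max", "2")   // leave cores free for other applications
```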

chunhunghan