
I am trying to execute a simple project with Apache Spark. This is my code, SimpleApp.scala:

/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/hduser/spark-1.2.0-bin-hadoop2.4/README.md" // Should be some file on your system
    // val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext("local", "Simple Job", "/home/hduser/spark-1.2.0-bin-hadoop2.4/")
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("hadoop")).count()
    val numBs = logData.filter(line => line.contains("see")).count()
    println("Lines with hadoop: %s, Lines with see: %s".format(numAs, numBs))
  }
}

When I manually submit this job to Spark on the command line with /home/hduser/spark-1.2.0-hadoop-2.4.0/bin/spark-submit --class "SimpleApp" --master local[4] target/scala-2.10/simple-project_2.10-1.0.jar, it runs successfully.

If I run it with sbt run while the Apache Spark service is running, it also succeeds, but at the end of the log it gives an error like this:

15/02/06 15:56:49 ERROR Utils: Uncaught exception in thread SparkListenerBus
java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:996)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1303)
    at java.util.concurrent.Semaphore.acquire(Semaphore.java:317)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:48)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1460)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)
15/02/06 15:56:49 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:136)
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134)
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1460)
    at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:133)
    at org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:65)

Is anything wrong in my code? Thanks in advance. I use Apache Spark 1.2.0-bin-hadoop2.4 and Scala 2.10.4.

– harushime

2 Answers


The SparkContext, or SparkSession for Spark >= 2.0.0, should be stopped when the Spark code finishes, by adding sc.stop() (or spark.stop() for Spark >= 2.0.0) at the end of the code. This shuts down the listener bus and context-cleaner daemon threads cleanly instead of interrupting them at JVM exit, which is what produces those InterruptedException messages.
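Applied to the code in the question, a minimal sketch looks like this (the try/finally wrapper is my addition, not required by Spark, but it guarantees stop() runs even if the job throws):

```scala
/* SimpleApp.scala — same job, with the SparkContext stopped at the end */
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/hduser/spark-1.2.0-bin-hadoop2.4/README.md"
    val sc = new SparkContext("local", "Simple Job", "/home/hduser/spark-1.2.0-bin-hadoop2.4/")
    try {
      val logData = sc.textFile(logFile, 2).cache()
      val numAs = logData.filter(line => line.contains("hadoop")).count()
      val numBs = logData.filter(line => line.contains("see")).count()
      println("Lines with hadoop: %s, Lines with see: %s".format(numAs, numBs))
    } finally {
      // Stop the context so the SparkListenerBus and ContextCleaner
      // threads are shut down cleanly rather than interrupted.
      sc.stop()
    }
  }
}
```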

– M.Rez

According to this mail archive, i.e.:

Hi Haoming,

You can safely disregard this error. This is printed at the end of the execution when we clean up and kill the daemon context cleaning thread. In the future it would be good to silence this particular message, as it may be confusing to users.

Andrew

the error could be disregarded.

– Shyamendra Solanki
  • Ohh, I see, sir. I thought the problem came from my code, but in fact it's just the log from Spark cleaning up the job. Thank you, sir. – harushime Feb 06 '15 at 10:05
  • Is this issue resolved? I'm running hive-on-spark with spark-1.5.1 and hive-1.2.1 in yarn-cluster mode. Receiving the same error "uncaught error in thread SparkListenerBus, stopping SparkContext java.lang.AbstractMethodError " – Arvindkumar Oct 22 '15 at 11:50
  • @Arvindkumar That issue is resolved, and I can now use it normally. – harushime Feb 06 '16 at 08:26