I am trying to launch a Spark job (Spark 1.4.0) on a cluster. Both from the command line and from Eclipse, I get an error saying the withDummyCallSite function is missing from Spark's Utils class. In the Maven dependencies I can see that spark-core_2.10-1.4.0.jar is loaded, which is supposed to include this function. I am running Java 1.7, the same Java version the code was previously compiled against. I can see on the Spark Master monitor that the job has launched, so it doesn't seem to be a firewall issue. Here is the error I see in the console (both from the command line and from Eclipse):

ERROR 09:53:06,314  Logging.scala:75 -- Task 0 in stage 1.0 failed 4 times; aborting job
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.withDummyCallSite(Lorg/apache/spark/SparkContext;Lscala/Function0;)Ljava/lang/Object;
    at org.apache.spark.sql.parquet.ParquetRelation2.buildScan(newParquet.scala:269)
    at org.apache.spark.sql.sources.HadoopFsRelation.buildScan(interfaces.scala:530)
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$8.apply(DataSourceStrategy.scala:98)
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$8.apply(DataSourceStrategy.scala:98)
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:266)
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:265)
    at org.apache.spark.sql.sources.DataSourceStrategy$.pruneFilterProjectRaw(DataSourceStrategy.scala:296)
    at org.apache.spark.sql.sources.DataSourceStrategy$.pruneFilterProject(DataSourceStrategy.scala:261)
    at org.apache.spark.sql.sources.DataSourceStrategy$.apply(DataSourceStrategy.scala:94)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.planLater(QueryPlanner.scala:54)
    at org.apache.spark.sql.execution.SparkStrategies$HashAggregation$.apply(SparkStrategies.scala:162)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
    at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan$lzycompute(SQLContext.scala:932)
    at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan(SQLContext.scala:930)
    at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan$lzycompute(SQLContext.scala:936)
    at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan(SQLContext.scala:936)
    at org.apache.spark.sql.DataFrame.collect(DataFrame.scala:1255)
    at org.apache.spark.sql.DataFrame.count(DataFrame.scala:1269)

(Log is truncated for brevity)

Thanks in advance for any pointers!

bbtus

1 Answer


Please check how your class is resolved by Maven (in Eclipse, Ctrl+Shift+T opens a type and shows which jar it is loaded from). Make sure it is not resolved from two different jars on your classpath.
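
For example, a rough command-line version of both checks (the dependency:tree filter and the jar path are illustrative; adjust them to your project):

    # Print the resolved dependency tree, restricted to Spark artifacts;
    # -Dverbose also lists versions Maven omitted because of conflicts.
    mvn dependency:tree -Dverbose -Dincludes=org.apache.spark

    # Verify the method is actually present in the jar Maven resolved
    # (the method lives on the Utils singleton, hence the Utils$ class):
    javap -classpath spark-core_2.10-1.4.0.jar 'org.apache.spark.util.Utils$' | grep withDummyCallSite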

If the class is pulled in through a transitive dependency, add the required jar as a direct dependency with the version you need; in Maven's "nearest wins" resolution, a direct dependency takes precedence over a transitive one. A sketch of such a declaration follows below.
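
A minimal pom.xml sketch of pinning the version from the question as a direct dependency (whether you also want a provided scope depends on how you submit the job):

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>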

You can refer to these related questions for more detail:

mockito test gives no such method error when run as junit test but when jars are added manually in run confugurations, it runs well

Exception in thread "main" java.lang.NoSuchMethodError: org.slf4j.impl.StaticLoggerBinder.getSingleton()Lorg/slf4j/impl/StaticLoggerBinder

asg
  • Thanks for the response. I double-checked the contents of spark-core_2.10-1.4.0.jar (downloaded through the dependency I specified in pom.xml), and the withDummyCallSite function was indeed missing. I manually downloaded the jar from http://mvnrepository.com and replaced it, and the problem is now gone. It is the exact same version (2.10-1.4.0), so I am not sure why the function was missing in the first place. – bbtus Nov 23 '15 at 17:32
  • No problem! Good to hear that replacing the jar manually solved your problem. In this kind of situation you can also delete the dependency's folder structure (e.g. C:/Users/{your username}/.m2/repository/org/apache/spark) from your local cache/repo. After deleting it, Maven downloads a fresh copy of the specified version from the remote repository into your local repo (a command-line sketch follows below). – asg Nov 24 '15 at 04:55
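
For later readers, a sketch of that clean-up on a Unix-like machine (paths are illustrative; the Windows equivalent is the .m2 path in the comment above):

    # Delete the cached (possibly corrupted) Spark artifacts from the
    # local Maven repository:
    rm -rf ~/.m2/repository/org/apache/spark

    # The next build re-resolves and re-downloads the declared versions:
    mvn clean package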