
Spark 2.3 is throwing the following exception. Can anyone please help? I tried adding the JARs.

    308 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - User class threw exception: java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
    java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
        at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
        at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
        at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
        at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
        at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
        at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
        at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at com.voicebase.etl.HBasePhoenixPerformance2.main(HBasePhoenixPerformance2.java:55)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$4.run(ApplicationMaster.scala:706)
    315 [main] ERROR org.apache.spark.deploy.yarn.ApplicationMaster - Uncaught exception: org.apache.spark.SparkException: Exception thrown in awaitResult:
        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
        at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:486)
        at org.apache.spark.deploy.yarn.ApplicationMaster.org$apache$spark$deploy$yarn$ApplicationMaster$$runImpl(ApplicationMaster.scala:345)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply$mcV$sp(ApplicationMaster.scala:260)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$2.apply(ApplicationMaster.scala:260)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$5.run(ApplicationMaster.scala:800)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
        at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:799)
        at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:259)
        at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:824)
        at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
    Caused by: java.util.concurrent.ExecutionException: Boxed Error

– Alchemist
  • Added io.netty:netty-all:4.1.17.Final and io.netty:netty:3.9.9.Final. – Alchemist May 17 '18 at 10:16
  • Not sure what is causing this error. I added the proper JAR, with no effect. Any help? – Alchemist May 17 '18 at 10:16
  • 3
    I have the same here, any solution. Incredible Spark dependency hell! – Joan Jun 21 '18 at 15:35
  • Possible duplicate of [Spark 2.3.0 netty version issue: NoSuchMethod io.netty.buffer.PooledByteBufAllocator.metric()](https://stackoverflow.com/questions/49137397/spark-2-3-0-netty-version-issue-nosuchmethod-io-netty-buffer-pooledbytebufalloc) – eshizhan Jul 03 '18 at 07:25
  • @Alchemist were you able to solve the issue? I am facing the same one. – Anubhav Jain Jan 01 '19 at 10:43
  • Excluding netty-all and netty from the Spark dependency and adding them directly worked for me. I used netty-all 4.1.17.Final and netty 3.9.9.Final. It looks like a conflict in JARs (a minimal sketch follows below). – dasrohith Oct 14 '20 at 12:41
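
A minimal sbt sketch of what that last comment describes, assuming an sbt build; the spark-core module and Spark version below are illustrative, not taken from the thread:

    // Exclude the Netty that Spark pulls in transitively...
    libraryDependencies += ("org.apache.spark" %% "spark-core" % "2.3.0")
      .exclude("io.netty", "netty-all")
      .exclude("io.netty", "netty")

    // ...then add the versions the comment mentions explicitly.
    libraryDependencies ++= Seq(
      "io.netty" % "netty-all" % "4.1.17.Final",
      "io.netty" % "netty"     % "3.9.9.Final"
    )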

4 Answers


This is because the Hadoop binaries were compiled against an older Netty version, and we just need to replace them. I haven't faced any issues with Hadoop after replacing them.

You need to replace netty-3.6.2.Final.jar and netty-all-4.0.23.Final.jar under $HADOOP_HOME/share/hadoop with netty-3.9.9.Final.jar and netty-all-4.1.17.Final.jar.
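
Before swapping anything, it can help to see exactly which Netty copies Hadoop ships. A throwaway Scala sketch of such a check, assuming Scala 2.13 and that $HADOOP_HOME is set; the object name is mine:

    import java.nio.file.{Files, Paths}
    import scala.jdk.CollectionConverters._

    object ListNettyJars extends App {
      // Walk $HADOOP_HOME/share/hadoop and print every bundled Netty JAR,
      // so you know exactly which copies to delete and replace.
      val root = Paths.get(sys.env("HADOOP_HOME"), "share", "hadoop")
      Files.walk(root).iterator().asScala
        .filter(_.getFileName.toString.matches("netty.*\\.jar"))
        .foreach(println)
    }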

– Suhas Kolaskar (edited by Xavier Guihot)
  • Really. I took those JARs from the Spark installation, replaced the ones in Hadoop, and it worked. – Vladimir Nabokov Aug 12 '19 at 11:09
  • Yes, it worked. Strangely, the [docs](https://netty.io/4.0/api/io/netty/buffer/PooledByteBufAllocator.html) show that it is part of 4.0, but it seems it only got included as of 4.1. – TroubleShooter Jan 02 '20 at 16:10
---

This issue arises from a mismatch between the Netty versions that Hadoop and Spark are compiled against. You can follow either of these approaches.

A similar issue was solved by manually compiling Spark with a specific version of Netty.

The other approach, as recommended by Suhas, is to copy the contents of the SPARK_HOME/jars folder into the various lib folders inside HADOOP_HOME/share/hadoop, or only the one in yarn; this also solves the problem, but it is a dirty fix (sketched below). So preferably use the latest versions of both, or compile them manually.
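
For reference, the "dirty fix" amounts to something like the following throwaway Scala sketch (my own, assuming Scala 2.13; whether the right target is yarn or yarn/lib depends on your Hadoop layout):

    import java.nio.file.{Files, Paths, StandardCopyOption}
    import scala.jdk.CollectionConverters._

    object CopySparkJars extends App {
      // Copy everything from $SPARK_HOME/jars into Hadoop's yarn folder so
      // YARN and Spark resolve the same Netty. Crude, but it unblocks you.
      val src = Paths.get(sys.env("SPARK_HOME"), "jars")
      val dst = Paths.get(sys.env("HADOOP_HOME"), "share", "hadoop", "yarn")
      Files.list(src).iterator().asScala.foreach { jar =>
        Files.copy(jar, dst.resolve(jar.getFileName), StandardCopyOption.REPLACE_EXISTING)
      }
    }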

---

An older version of Netty was required by aws-java-sdk. Deleting all the Netty JARs and removing aws-java-sdk from the project solved the problem.
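
If you cannot drop the SDK entirely, an alternative (not what this answer did) is to keep it but exclude the Netty it pulls in, so Spark's Netty wins. A hedged sbt sketch; the SDK artifact and version are illustrative:

    // Keep aws-java-sdk but stop it from dragging in an older Netty.
    libraryDependencies += ("com.amazonaws" % "aws-java-sdk" % "1.11.271")
      .excludeAll(ExclusionRule(organization = "io.netty"))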

– Logister (edited by double-beep)
---

The issue was resolved by adding the Netty JARs below to the dependencies:

   "io.netty" % "netty-all" % "4.1.68.Final"
   "io.netty" % "netty-buffer" % "4.1.68.Final"

And excluding all existing Netty JARs by adding an excludeAll rule to the dependency that pulls them in:

    // Note: the organization is "io.netty" (as posted, "io.netty.buffer"
    // matches nothing); the Spark module below is illustrative.
    val excludeNettyBufferBinding = ExclusionRule(organization = "io.netty")
    libraryDependencies += ("org.apache.spark" %% "spark-core" % "2.3.0").excludeAll(excludeNettyBufferBinding)
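
Whichever fix you apply, a quick way to confirm which Netty actually wins at runtime is to print where the offending class was loaded from (a diagnostic snippet of my own, not part of the answer):

    // Prints the JAR that PooledByteBufAllocator was loaded from; if this
    // points into Hadoop's share directory, the old copy is still shadowing
    // the one you pinned in the build.
    println(classOf[io.netty.buffer.PooledByteBufAllocator]
      .getProtectionDomain.getCodeSource.getLocation)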