
I just upgraded my Spark project from 2.2.1 to 2.3.0, only to hit the versioning exception below. I have dependencies on spark-cassandra-connector 2.0.7 and cassandra-driver-core 3.4.0 from DataStax, which in turn depend on netty 4.x, whereas Spark 2.3.0 uses 3.9.x.

The class raising the exception, org.apache.spark.network.util.NettyMemoryMetrics, was introduced in Spark 2.3.0.

Is downgrading my Cassandra dependencies the only way around the exception? Thanks!

```
Exception in thread "main" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
    at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
    at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)
```
rodders

2 Answers


It seems like you are using a netty 4 version that is too old. Maybe you have multiple netty jars on your classpath? Having netty 4.x and 3.x on the classpath together should not be a problem.

Norman Maurer
  • Thanks, you pointed me in the right direction. I come from the .NET world and Scala is still a bit new to me. I eventually got it fixed by following this: https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html – rodders Mar 08 '18 at 10:53
  • @rodders can you tell us how you fixed it? – M. Alexandru Apr 26 '18 at 09:58
  • I got past it by adding this fragment to my pom, thus forcing a common netty reference: `<dependency> <groupId>io.netty</groupId> <artifactId>netty-all</artifactId> <version>4.1.17.Final</version> </dependency>` (see the pom sketch after these comments) – rodders Apr 26 '18 at 13:28
  • @rodders I am also facing the same issue, but in my case I am using pyspark (v2.3.2) and Hadoop (v2.8.3). When I submit a pyspark job using spark-submit and open the logs of the YARN container, I get a similar error. Do you have any idea how to solve it in the case of pyspark? – Anubhav Jain Jan 16 '19 at 15:47
  • I had the same problem, and after excluding netty from one of the dependencies, `<dependency> <groupId>com.datastax.spark</groupId> <artifactId>spark-cassandra-connector_2.11</artifactId> <version>${spark-cassandra-connector_2.11.version}</version> <exclusions> <exclusion> <groupId>io.netty</groupId> <artifactId>netty-all</artifactId> </exclusion> </exclusions> </dependency>`, it works for me – Joey Trang May 31 '19 at 23:02
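Putting the two comments above together, here is a minimal pom.xml sketch of the fix they describe. The 4.1.17.Final version and the ${spark-cassandra-connector_2.11.version} property are taken from the comments and are only placeholders; pin netty-all to whatever version your Spark release actually ships.

```xml
<!-- Option 1 (rodders' comment): force one netty 4 version for the whole build.
     Version taken from the comment above; adjust to match your Spark release. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
      <version>4.1.17.Final</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- Option 2 (Joey Trang's comment): keep the connector but exclude the older
     netty-all it drags in, so the netty-all provided by Spark wins. -->
<dependencies>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>${spark-cassandra-connector_2.11.version}</version>
    <exclusions>
      <exclusion>
        <groupId>io.netty</groupId>
        <artifactId>netty-all</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
</dependencies>
```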

I would like to add some more details to the answer to make this easier. Just run `mvn dependency:tree -Dverbose -Dincludes=io.netty:netty-all`; it will list every dependency that pulls in io.netty and the version it uses. In my case the culprit was Hive JDBC 2.1.0, which brings in a netty-all version lower than the one used by Spark 2.3.1, so Spark's netty is never loaded because the older one from hive-jdbc is already on the classpath.

So the fix is to exclude the netty dependency from hive-jdbc in pom.xml.
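A minimal sketch of that exclusion, assuming the usual org.apache.hive:hive-jdbc coordinates (the 2.1.0 version is the one named in this answer; use your own):

```xml
<!-- hive-jdbc 2.1.0 pulls in an older netty-all; exclude it so the newer netty 4
     that Spark provides is the one left on the classpath -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>2.1.0</version>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

After adding the exclusion, re-run `mvn dependency:tree -Dverbose -Dincludes=io.netty:netty-all` to confirm that only the netty-all brought in by Spark remains.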

Pyd