
I am trying to run a simple script with Spark, and it is giving me:

java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;

I saw this thread: Spark 2.3.0 netty version issue: NoSuchMethod io.netty.buffer.PooledByteBufAllocator.metric()

But even after I added the newer netty module to my pom.xml, I am still seeing it. My pom.xml looks like this:

 <dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_2.11</artifactId>
   <version>2.3.0</version>
 </dependency>
 <dependency>
   <groupId>io.netty</groupId>
   <artifactId>netty-all</artifactId>
   <version>4.1.17.Final</version>
 </dependency>
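
For what it's worth, `mvn dependency:tree -Dincludes=io.netty` shows where any older netty copy sneaks in, and as far as I understand, a `dependencyManagement` entry like the sketch below should force every transitive `netty-all` to the pinned version, though I may be misunderstanding Maven here:

 <dependencyManagement>
   <dependencies>
     <!-- pin netty for transitive dependencies, not just direct ones -->
     <dependency>
       <groupId>io.netty</groupId>
       <artifactId>netty-all</artifactId>
       <version>4.1.17.Final</version>
     </dependency>
   </dependencies>
 </dependencyManagement>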

I have also tried using Spark 2.2.1, but that gives me:

java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem

So I am a bit stuck; is there any other configuration that I can try?

  • With `netty-all 4.1.17.Final` in your `pom.xml` you should not be seeing the `NoSuchMethodError`. Maybe you can run `mvn clean compile` and try again. – John Jun 02 '18 at 13:59
  • You are right, this error has gone away on its own. I am now seeing a different error, thanks! – TurningLock Jun 05 '18 at 00:51

1 Answer


So this may be of help. First though, I'm not a big Java guy and this is my first foray into Cassandra/Spark. I think they fixed it in the source downloads, because the managed netty dependency is now present in 2.3.0 (I swear it was not there like a day ago, but I might be crazy). I was also trying to use the Cassandra connector, and the inclusion of those jars may have been causing my issue. In any case, this is an excerpt from my Dockerfile:

# SPARK: fetch the 2.3.0 source release and build a custom distribution
ENV SPARK_COMMIT 992447fb30ee9ebb3cf794f2d06f4d63a2d792db
ENV SPARK_VERSION 2.3.0
ENV SPARK_PREFIX /usr
ENV SPARK_HOME /usr/spark-${SPARK_VERSION}
ENV _R_CHECK_FORCE_SUGGESTS_ false
RUN wget http://apache.claz.org/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}.tgz && \
    tar -xzf spark-${SPARK_VERSION}.tgz -C $SPARK_PREFIX

# COPY ./builds/secondpage-spark/pom.xml $SPARK_HOME/pom.xml

RUN cd $SPARK_HOME && ./dev/make-distribution.sh --name custom-spark --pip --tgz \
    -Phive -Phive-thriftserver -Pyarn -Pkubernetes -DskipTests

# SPARK CASSANDRA CONNECTOR: build docs, package, and the assembly jar from a pinned commit
ENV CONNECTOR_COMMIT d8a3eb4
ENV CONNECTOR_HOME /usr/spark-cassandra-connector
RUN git clone https://github.com/datastax/spark-cassandra-connector $CONNECTOR_HOME && \
cd $CONNECTOR_HOME && git checkout $CONNECTOR_COMMIT && \
sbt/sbt doc && \
sbt/sbt package && \
sbt/sbt assembly

There's a bunch of extraneous stuff in there because I've been throwing things at the wall and this is the first thing that stuck. So it could stand to be cleaned up, or you could convert it to pure bash, whatever you want.
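
One more thing: once that assembly builds, the fat jar should be usable with spark-submit. I haven't pinned down the exact output path (the jar name under target/ depends on the connector and Scala versions), so this is just a sketch, and my_script.py stands in for whatever you're running:

# guessing at the assembly path; check target/scala-2.11/ after the build
spark-submit \
  --jars $CONNECTOR_HOME/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-2.3.0.jar \
  --conf spark.cassandra.connection.host=127.0.0.1 \
  my_script.py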