
I have cloned the DL4J examples and am just trying to run one of them, LogDataExample.java. The project builds successfully and everything seems fine, except that when starting it the following exception is thrown:

Exception in thread "main" java.lang.NoSuchMethodError: io.netty.util.concurrent.SingleThreadEventExecutor.<init>(Lio/netty/util/concurrent/EventExecutorGroup;Ljava/util/concurrent/Executor;ZLjava/util/Queue;Lio/netty/util/concurrent/RejectedExecutionHandler;)V
    at io.netty.channel.SingleThreadEventLoop.<init>(SingleThreadEventLoop.java:65)
    at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:138)
    at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:138)
    at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:37)
    at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
    at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
    at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
    at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
    at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:78)
    at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:73)
    at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:60)
    at org.apache.spark.network.util.NettyUtils.createEventLoop(NettyUtils.java:50)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:102)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
    at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at org.datavec.transform.logdata.LogDataExample.main(LogDataExample.java:85)

I was not able to find anything online that would help me fix this. My code is exactly the same as in the example.

pom.xml contains the following:

<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.46.Final</version>
</dependency>
mirzak

1 Answer


I think you are forcing a newer version of Netty than Spark supports.

By running mvn dependency:tree you can see what version of Netty Spark wants, and use that instead of the one you've defined.
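For example (assuming a standard Maven project layout), the output can be narrowed to just the Netty artifacts using the plugin's `includes` filter:

```shell
# Print only the io.netty artifacts in the dependency tree;
# the version Spark resolves transitively is the one to pin in pom.xml.
mvn dependency:tree -Dincludes=io.netty
```

The version shown next to Spark's transitive `netty-all` entry is the one to put in the `<version>` element of your own dependency declaration.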

If you don't care about Spark and just want to use DataVec to transform your data, take a look at https://www.dubs.tech/guides/quickstart-with-dl4j/. It is a little outdated regarding the dependencies, but the DataVec part shows how to use it without Spark.

Paul Dubs
  • Running it I get the following. I am new to Maven, so I am not sure what this tells me: https://justpaste.it/3iqdh – mirzak Mar 02 '20 at 19:56
  • 1
  • Made it work by just trying out different versions, 4.1.33.Final being the one that works. Thank you for the hint. – mirzak Mar 02 '20 at 20:04
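For reference, the override the asker reported working (4.1.33.Final, per the comment above) would look like this in pom.xml. This is only a sketch: the safest choice remains whichever version `mvn dependency:tree` shows Spark actually resolving in your build.

```xml
<!-- Pin Netty to the version Spark's transitive dependencies expect,
     instead of forcing a newer release -->
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.33.Final</version>
</dependency>
```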