
I have a Livy server running locally, and it connects to a remote YARN cluster to run Spark jobs.

I get the error below when I upload my jar through the Livy programmatic Job API.

It looks like the correct Netty channel handler is not being mapped for the request message; as a result, the Spark context never gets created.

livy.conf settings:

livy.spark.master = yarn
livy.spark.deploy-mode = cluster

I am able to POST to the Livy REST /batches API, and it submits and completes the job on the YARN cluster successfully.
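For reference, the working REST submission looks roughly like this (a minimal sketch; the Livy URL, jar path, and class name are placeholders, and the actual `curl` call is left commented since it requires a reachable Livy server):

```shell
# Livy server endpoint and batch payload — all values below are placeholders.
LIVY_URL="${LIVY_URL:-http://localhost:8998}"
PAYLOAD='{"file":"hdfs:///jars/my-app.jar","className":"com.example.MySparkJob"}'

# Actual submission to the /batches API (uncomment against a running server):
# curl -s -X POST -H "Content-Type: application/json" -d "$PAYLOAD" "$LIVY_URL/batches"

echo "$PAYLOAD"
```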

With the same config, when I try to upload the jar from my Java client, it fails with the error below.

Livy version: 0.4.0-incubating

HTTP Client:

<dependency>
    <groupId>org.apache.livy</groupId>
    <artifactId>livy-client-http</artifactId>
    <version>0.4.0-incubating</version>
</dependency>

Error:

17/09/22 22:47:50 DEBUG RSCDriver: Registered new connection from [id: 0x46d83447, L:/127.0.0.1:10000 - R:/127.0.0.1:9273].
17/09/22 22:47:50 DEBUG RpcDispatcher: [ReplDriver] Registered outstanding rpc 0 (org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress).
17/09/22 22:47:50 DEBUG KryoMessageCodec: Encoded message of type org.apache.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
17/09/22 22:47:50 DEBUG KryoMessageCodec: Encoded message of type org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress (90 bytes)
17/09/22 22:47:50 DEBUG KryoMessageCodec: Decoded message of type org.apache.livy.rsc.rpc.Rpc$MessageHeader (5 bytes)
17/09/22 22:47:50 DEBUG KryoMessageCodec: Decoded message of type org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress (90 bytes)
17/09/22 22:47:50 DEBUG RpcDispatcher: [ReplDriver] Received RPC message: type=CALL id=0 payload=org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress
17/09/22 22:47:50 WARN RpcDispatcher: [ReplDriver] Failed to find handler for msg 'org.apache.livy.rsc.BaseProtocol$RemoteDriverAddress'.
17/09/22 22:47:50 DEBUG RpcDispatcher: [ReplDriver] Caught exception in channel pipeline.
java.lang.NullPointerException
at org.apache.livy.rsc.Utils.stackTraceAsString(Utils.java:95)
at org.apache.livy.rsc.rpc.RpcDispatcher.handleCall(RpcDispatcher.java:121)
at org.apache.livy.rsc.rpc.RpcDispatcher.channelRead0(RpcDispatcher.java:77)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:293)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:267)
at io.netty.handler.codec.ByteToMessageCodec.channelRead(ByteToMessageCodec.java:103)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
at java.lang.Thread.run(Thread.java:745)
17/09/22 22:47:50 WARN RpcDispatcher: [ReplDriver] Closing RPC channel with 1 outstanding RPCs.
17/09/22 22:47:50 WARN RpcDispatcher: [ReplDriver] Closing RPC channel with 1 outstanding RPCs.
17/09/22 22:47:50 ERROR ApplicationMaster: User class threw exception: java.util.concurrent.CancellationException
java.util.concurrent.CancellationException
at io.netty.util.concurrent.DefaultPromise.cancel(...)(Unknown Source)
17/09/22 22:47:50 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.util.concurrent.CancellationException)
17/09/22 22:47:50 ERROR ApplicationMaster: Uncaught exception: 
org.apache.spark.SparkException: Exception thrown in awaitResult: 
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:401)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:254)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:764)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:762)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.util.concurrent.CancellationException
at io.netty.util.concurrent.DefaultPromise.cancel(...)(Unknown Source)
17/09/22 22:47:50 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: java.util.concurrent.CancellationException)
17/09/22 22:47:50 INFO ShutdownHookManager: Shutdown hook called.       
