
I have been working with a standalone Spark server with Jobserver. For various reasons I had to migrate to an Ambari cluster, and since it already ships with Livy, I think it is better to use that instead of Jobserver.

Now I'm lost trying to migrate my existing Java Jobserver code to Livy. I have read the Livy documentation, and this is what I have found:

In Livy you can execute batches, which are somewhat like the ad-hoc mode in the Jobserver. And there are sessions, where a single Spark context per session can be used to execute statements. Am I right?

  • Is there a way to execute a batch or a statement synchronously, like in the Jobserver, so that with a single request you get the result? Currently I can only see an asynchronous mode.
  • In the Jobserver you can reference the Java class where your functionality is implemented; that class only needs to implement JSqlJob<String>, and the Jobserver knows it has to execute the run() method of this class. But in Livy, when I tried to execute the Pi calculation, it failed.
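On the synchronous question: as far as I can tell the batch endpoint is asynchronous only, and the usual workaround (my suggestion, not something the Livy docs mandate) is to poll GET /batches/{id} until the job reaches a terminal state. A minimal sketch, assuming a hypothetical `LivyPoller` helper and using a regex where a real client would use a JSON library:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical helper: emulate a synchronous call by polling Livy's
// asynchronous batch endpoint until the batch reaches a terminal state.
public class LivyPoller {

  // Pull the "state" field out of Livy's JSON response; a real client
  // would use a JSON library instead of a regex.
  static String extractState(String json) {
    Matcher m = Pattern.compile("\"state\"\\s*:\\s*\"([^\"]+)\"").matcher(json);
    return m.find() ? m.group(1) : "unknown";
  }

  // Poll GET /batches/{id} until the batch finishes (success or dead).
  static String waitForBatch(String livyUrl, int batchId) throws Exception {
    while (true) {
      URL url = new URL(livyUrl + "/batches/" + batchId);
      HttpURLConnection conn = (HttpURLConnection) url.openConnection();
      conn.setRequestProperty("X-Requested-By", "admin");
      StringBuilder body = new StringBuilder();
      try (BufferedReader in = new BufferedReader(
          new InputStreamReader(conn.getInputStream()))) {
        String line;
        while ((line = in.readLine()) != null) body.append(line);
      }
      String state = extractState(body.toString());
      if (state.equals("success") || state.equals("dead")) return state;
      Thread.sleep(2000); // back off between polls
    }
  }

  public static void main(String[] args) {
    // Demo of the parsing step on a sample Livy response body.
    System.out.println(extractState("{\"id\":3,\"state\":\"success\"}"));
  }
}
```

The request itself stays asynchronous; the loop just hides that behind a blocking call.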

I'm using the following curl command:

    curl -X POST -d '{ "file": "hdfs://a1:8020/user/value_iq/Livy_Pi_Example-1.0-SNAPSHOT-dep.jar", "className": "value_iq.viq.PiJob","proxyUser": "hdfs", "name": "Livy Pi Example", "conf": {"executorCores":1, "executorMemory":"2G", "driverCores":1, "driverMemory":"512m", "queue":"uno"}, "args" : ["100"] }' -H "Content-Type: application/json" -H "X-Requested-By: admin" http://ambari.value-iq.com:8999/batches

And it fails with:

java.lang.NoSuchMethodException: value_iq.viq.PiJob.main([Ljava.lang.String;)
    at java.lang.Class.getMethod(Class.java:1786)
    at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:641)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:416)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:282)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:768)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:67)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$2.run(SparkHadoopUtil.scala:66)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:766)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)

What do I need to implement in such a main method?
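From the stack trace, the /batches endpoint launches the jar through spark-submit, which looks for a standard `public static void main(String[])` entry point rather than the Livy `Job` interface. A self-contained sketch of what that entry point's shape looks like, with the same Monte Carlo math as the job below but the SparkContext wiring left out (it is indicated in comments) so the example runs on its own:

```java
import java.util.Random;

// Sketch of a standard main entry point, as expected by spark-submit
// (and therefore by Livy's /batches endpoint).  The Monte Carlo math
// mirrors the PiJob class; the Spark wiring is omitted to keep the
// sketch self-contained.
public class PiMain {

  // Estimate pi by sampling points in the unit square; a seeded Random
  // keeps the result reproducible.
  static double estimatePi(int samples, long seed) {
    Random rnd = new Random(seed);
    int inside = 0;
    for (int i = 0; i < samples; i++) {
      double x = rnd.nextDouble();
      double y = rnd.nextDouble();
      if (x * x + y * y < 1) inside++;
    }
    return 4.0 * inside / samples;
  }

  public static void main(String[] args) {
    int samples = args.length > 0 ? Integer.parseInt(args[0]) : 1_000_000;
    // In a real Spark job this loop would be distributed, e.g.
    // sc.parallelize(sampleList).map(...).reduce(...), after creating
    // a SparkContext here in main().
    System.out.println("Pi is roughly " + estimatePi(samples, 42L));
  }
}
```

With a main method like this (plus the SparkContext creation), the curl above should stop failing with NoSuchMethodException.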

This is my java class:

package value_iq.viq;

import java.util.*;

import org.apache.spark.api.java.function.*;

import org.apache.livy.*;

public class PiJob implements Job<Double>, Function<Integer, Integer>,
  Function2<Integer, Integer, Integer> {

  private final int samples;

  public PiJob(int samples) {
    this.samples = samples;
  }

  @Override
  public Double call(JobContext ctx) throws Exception {
    List<Integer> sampleList = new ArrayList<Integer>();
    for (int i = 0; i < samples; i++) {
      sampleList.add(i + 1);
    }

    Integer max = ctx.sqlctx().sql("select max(id) from livy_test").takeAsList(1).get(0).getInt(0) + 1;

    ctx.sqlctx().sql("insert into livy_test values("+max+", now())");

    return 4.0d * ctx.sc().parallelize(sampleList).map(this).reduce(this) / samples;
  }

  @Override
  public Integer call(Integer v1) {
    double x = Math.random();
    double y = Math.random();
    return (x*x + y*y < 1) ? 1 : 0;
  }

  @Override
  public Integer call(Integer v1, Integer v2) {
    return v1 + v2;
  }
}

I got this class from https://livy.incubator.apache.org/docs/latest/programmatic-api.html, and it says: "To submit this code using Livy, create a LivyClient instance." Do I have to create another jar in order to upload the one with the PiJob? Can't I do this by just uploading the jar and specifying the Java class in the curl command, like I'm trying to do?
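For reference, the pattern that documentation page describes is a separate driver program that uploads the jar and submits the `Job` through a LivyClient, rather than referencing the class in a /batches request. A sketch following that pattern (it needs the livy-client-http dependency and a running Livy server, so it is not runnable standalone; the jar path is the one from the curl above):

    import java.io.File;
    import java.net.URI;

    import org.apache.livy.LivyClient;
    import org.apache.livy.LivyClientBuilder;

    // Sketch following the Livy programmatic-API docs: a small driver
    // program creates a LivyClient, uploads the jar containing PiJob,
    // and submits the job, blocking on get() for the Double result.
    public class PiApp {
      public static void main(String[] args) throws Exception {
        LivyClient client = new LivyClientBuilder()
            .setURI(new URI("http://ambari.value-iq.com:8999"))
            .build();
        try {
          // Ship the jar that contains PiJob to the Livy session.
          client.uploadJar(new File("Livy_Pi_Example-1.0-SNAPSHOT-dep.jar")).get();
          // submit() is asynchronous; get() waits for the result.
          double pi = client.submit(new PiJob(100)).get();
          System.out.println("Pi is roughly " + pi);
        } finally {
          client.stop(true);
        }
      }
    }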
