
This is a simple Java Spark job, based on the example in the Spark Jobserver GitHub repo:

package com.sample.wordcount;

import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;
import org.apache.spark.api.java.JavaSparkContext;
import spark.jobserver.japi.JSparkJob;
import spark.jobserver.api.JobEnvironment;

public class SparkJavaJob implements JSparkJob {
    @Override
    public Object run(Object sc, JobEnvironment runtime, Config data) {
        return "OK";
    }

    @Override
    public Config verify(Object sc, JobEnvironment runtime, Config config) {
        // Accept any job configuration without validation.
        return ConfigFactory.empty();
    }
}

When I submit it to Spark Jobserver, job loading fails with the following error:

{ "status": "JOB LOADING FAILED", "result": { "message": "com.sample.wordcount.SparkJavaJob cannot be cast to spark.jobserver.api.SparkJobBase", "errorClass": "java.lang.ClassCastException" }

Can anyone help me out with this?

  • I don't know job-server, but it looks like SparkJavaJob should implement spark.jobserver.api.SparkJobBase, not spark.jobserver.japi.JSparkJob, no? Maybe some version mismatch somewhere... – jgp Oct 09 '21 at 14:02

1 Answer


I was able to solve this issue. Basically, I had to change the context type when creating the context:

curl -d '' 'localhost:8090/contexts/jcontext?context-factory=spark.jobserver.context.JavaSparkContextFactory&num-cpu-cores=2&memory-per-node=1g'

(Note that the URL must be quoted; otherwise the shell treats each & as a background operator and truncates the query string.)

We have to use spark.jobserver.context.JavaSparkContextFactory. The default context factory produces a Scala context that only accepts jobs implementing spark.jobserver.api.SparkJobBase, which is exactly the cast that fails in the error message; a context built with the Java factory can load japi JSparkJob implementations.
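For completeness, the rest of the flow then targets that context. A minimal sketch, assuming the job is packaged as wordcount.jar and uploaded under the app name wordcount (the jar name and app name are placeholders, not from the question):

curl --data-binary @wordcount.jar localhost:8090/jars/wordcount

curl -d '' 'localhost:8090/jobs?appName=wordcount&classPath=com.sample.wordcount.SparkJavaJob&context=jcontext'

The context=jcontext parameter is what ties the job run to the context created with JavaSparkContextFactory; without it, Jobserver would start the job in a default (Scala) context and the same ClassCastException would come back.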
