Questions tagged [spark-jobserver]

spark-jobserver provides a RESTful interface for submitting and managing Apache Spark jobs, jars, and job contexts.

Reference: https://github.com/spark-jobserver/spark-jobserver

Tutorial: https://nishutayaltech.blogspot.com/2016/05/how-to-run-spark-job-server-and-spark.html
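The REST interface can be exercised from the command line. A minimal sketch, assuming a job server on localhost:8090, an app name of "test", and the WordCountExample class from the project's test jar (all three are assumptions; adjust to your deployment). The curl calls are left commented so the sketch runs without a live server:

```shell
# Assumed deployment details; change to match your setup.
HOST=localhost:8090
APP=test
CLASS=spark.jobserver.WordCountExample

# Upload an application jar under the app name (run against a live server):
# curl --data-binary @job-server-tests.jar "http://$HOST/jars/$APP"

# Submit a job to that app; append &sync=true to wait for the result:
# curl -d "input.string = a b c a b" "http://$HOST/jobs?appName=$APP&classPath=$CLASS"

# The composed submission URL:
echo "POST http://$HOST/jobs?appName=$APP&classPath=$CLASS"
```

A GET on /jobs/&lt;jobId&gt; afterwards returns the job's status and, once finished, its result.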

165 questions
0 votes · 0 answers

Spark Jobserver Executor application finished with state KILLED exitStatus 0

Using Apache Spark 2.0.1 deployed in standalone mode together with jobserver 0.7.0. I have a small job to test whether the context is operational, because sometimes the context is killed but the Java process on my server is still alive. So I double-check if the…
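A liveness check along the lines the asker describes can go through the contexts endpoint rather than the JVM process. A minimal sketch, assuming the server is on localhost:8090 and a hypothetical context name of "test-context"; the curl calls are commented so the sketch runs without a live server:

```shell
HOST=localhost:8090
CTX=test-context

# List active contexts; if $CTX is absent from the returned JSON array
# (or the call fails), the context is gone even if a stray JVM survives:
# curl "http://$HOST/contexts"

# Delete and recreate it if needed:
# curl -X DELETE "http://$HOST/contexts/$CTX"
# curl -d "" "http://$HOST/contexts/$CTX?num-cpu-cores=2&memory-per-node=512m"

echo "GET http://$HOST/contexts"
```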
0 votes · 1 answer

Spark JobServer JobEnvironment

def main(args: Array[String]) { val conf = new SparkConf().setMaster("local[4]").setAppName("LongPiJob") val sc = new SparkContext(conf) val env = new JobEnvironment { def jobId: String = "abcdef" //scalastyle:off def namedObjects:…
ozzieisaacs
0 votes · 1 answer

Persisting Spark Jobserver NamedObjects using Java

Using Spark Jobserver 0.6.2 and Apache Spark 2.0.2, I already have some functionality implemented, but I can't find how to persist a Dataset so it can be shared across different jobs on a specific context in Java. Is this functionality only available…
0 votes · 1 answer

FiloDB + Spark Streaming Data Loss

I'm using FiloDB 0.4 with Cassandra 2.2.5 as the column and meta store, and I'm trying to insert data into it using Spark Streaming 1.6.1 + Jobserver 0.6.2. I use the following code to insert data: messages.foreachRDD(parseAndSaveToFiloDb) private static…
I V
0 votes · 1 answer

Spark JobServer: graphx VertexRDD java.lang.ClassNotFoundException

I am developing a SparkJob on jobserver (v0.6.2, Spark 1.6.1) using Spark GraphX, and I am running into the following exception when trying to launch my job on Spark JobServer: { "status": "JOB LOADING FAILED", "result": { "errorClass":…
zaki benz
0 votes · 1 answer

Spark job-server show logs

I've set up a spark job-server (see https://github.com/spark-jobserver/spark-jobserver/tree/jobserver-0.6.2-spark-1.6.1) in standalone mode. I've set up some jobs using Scala; every job uses the same shared context, but I don't understand how to…
Marco Fedele
0 votes · 1 answer

Using Spark JobServer, Spark doesn't use the MySQL connection configured in hive-site.xml

Working with Spark 2.0.2, I have a jar which works fine with spark-submit. Now I want to use it from Spark JobServer. The first problem was that the methods: public SparkJobValidation validate(SparkContext sc, Config config) { return…
0 votes · 1 answer

Maven repository for spark-jobserver 0.7.0

I'm setting up a Maven Java project to implement SparkJobs for a spark-jobserver. The GitHub Spark JobServer page mentions the new 0.7.0 version, but I can't find it in the Maven repository they link to.
0 votes · 1 answer

Error starting spark-jobserver with Apache Spark 2.0.2

I'm trying to start the spark-jobserver, but I can't find any reference to this akka library in the installation steps provided on the GitHub spark-jobserver page. I'm running Spark in standalone mode on a single server which acts as both master and worker. But…
0 votes · 1 answer

spark-jobserver cannot build on Spark 1.6.2

I'm trying to run spark-jobserver 0.6.2 with Spark 1.6.2. Currently what I'm doing is this:
git clone https://github.com/spark-jobserver/spark-jobserver.git
git checkout tags/v0.6.2 -f
sbt job-server/package
At this point the system crashes…
Marco Fedele
0 votes · 1 answer

sparkjobserver adds a [ in front of every { and [

I am using the spark-jobserver RESTful service. Everything works fine, except that the returned JSON string has an extra [] around every object and array. Each array becomes [[.......]] and each object becomes [{.....}]. Has anyone seen this problem before?…
bhomass
0 votes · 1 answer

Where to see spark-jobserver main console output?

When running spark-submit, you can see the println statements right in the shell. When submitting a Spark job to the spark-jobserver, I can't find where the stdout messages go. Does anyone know?
bhomass
0 votes · 2 answers

How to use NamedDataFrame from spark job server

I used SJS for my project and would like to know how NamedDataFrame from SJS works. My first program does this: val schemaString = "parm1:int,parm2:string,parm3:string,parm4:string,parm5:int,parm6:string,parm7:int,parm8:int" val schema =…
user1384205
0 votes · 1 answer

Scala Runtime errors calling program on Spark Job Server

I used Spark 1.6.2 and Scala 2.11.8 to compile my project. The generated uber jar with dependencies is placed inside Spark Job Server, which seems to use Scala 2.10 (SCALA_VERSION=2.10.4 is specified in the .sh file). There is no problem in starting the…
user1384205
0 votes · 1 answer

Persistence with NamedObjects in Spark Job Server

I'm using the latest SJS version (master), and the application extends SparkHiveJob. In the runJob implementation I have the following: val eDF1 = hive.applySchema(rowRDD1, schema) I would like to persist eDF1 and tried the following val…
user1384205