
I am trying to follow this documentation:

https://github.com/spark-jobserver/spark-jobserver#dependency-jars

Option 2 listed in the docs says:

The dependent-jar-uris can also be used in job configuration param when submitting a job. On an ad-hoc context this has the same effect as dependent-jar-uris context configuration param. On a persistent context the jars will be loaded for the current job and then for every job that will be executed on the persistent context.

    curl -d "" 'localhost:8090/contexts/test-context?num-cpu-cores=4&memory-per-node=512m'
    OK⏎
    curl 'localhost:8090/jobs?appName=test&classPath=spark.jobserver.WordCountExample&context=test-context&sync=true' -d '{
      dependent-jar-uris = ["file:///myjars/deps01.jar", "file:///myjars/deps02.jar"],
      input.string = "a b c a b see"
    }'

The jars /myjars/deps01.jar & /myjars/deps02.jar (present only on the SJS node) will be loaded and made available for the Spark driver & executors.

Is "file:///myjars/" directory the SJS node's JAR directory or some custom directory?

I have a client on a Windows box and a Spark JobServer on a Linux box. I upload a JAR to the SJS node, and the SJS node puts that JAR somewhere. Then, when I call to start a job and set 'dependent-jar-uris', the SJS node finds my previously uploaded JAR and runs the job:

"dependent-jar-uris" set to "file:///tmp/spark-jobserver/filedao/data/simpleJobxxxxxx.jar"

This works fine, but I had to manually search around the SJS node to find this location (e.g. file:///tmp/spark-jobserver/filedao/data/simpleJobxxxxxx.jar) and then add it to my future requests to start the job.
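For reference, a minimal sketch of that workflow with curl; the app name (simpleJob) and class path (com.example.SimpleJob) are hypothetical placeholders, and the filedao path is the one I found by hand:

    # Upload the application JAR via the job server's upload endpoint (POST /jars/<appName>):
    curl --data-binary @simpleJob.jar localhost:8090/jars/simpleJob

    # Start the job, pointing dependent-jar-uris at the location where SJS stored the upload
    # (found by manually searching the SJS node):
    curl 'localhost:8090/jobs?appName=simpleJob&classPath=com.example.SimpleJob&context=test-context&sync=true' \
      -d '{ dependent-jar-uris = ["file:///tmp/spark-jobserver/filedao/data/simpleJobxxxxxx.jar"] }'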

Instead, how do I make a REST call from the client to get the path where Spark JobServer puts my jars when I upload them, so that I can set the file:/// path in my 'dependent-jar-uris' property dynamically?
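For instance, I can list the uploaded jars with GET /jars, but as far as I can tell the response only contains app names and upload timestamps, not filesystem paths:

    curl localhost:8090/jars
    # e.g. {"simpleJob": "2016-08-12T10:00:00.000-00:00"}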

Jason

1 Answer


I don't think jars uploaded using "POST /jars" can be used in dependent-jar-uris. Since you are uploading the jars yourself, you already know the local path. Just use that.
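A minimal sketch of that approach, assuming an SSH-accessible SJS host (sjs-host is a placeholder) and reusing the /myjars paths from the documentation example:

    # Copy the dependency to a path you control on the SJS node:
    scp deps01.jar user@sjs-host:/myjars/deps01.jar

    # Reference that known path when starting the job:
    curl 'localhost:8090/jobs?appName=test&classPath=spark.jobserver.WordCountExample&context=test-context&sync=true' \
      -d '{ dependent-jar-uris = ["file:///myjars/deps01.jar"], input.string = "a b c a b see" }'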

noorul
  • In my case, the job execution client does not have a reference to the JARs locally on the client; it expects them to be on the server already. If Spark JobServer allows separately uploading jars via a REST API, then how are they ever referenced later, if not via 'dependent-jar-uris'? – Jason Aug 12 '16 at 17:05
  • POST /jars is not for uploading dependent jars; it is for uploading the Spark app jar. I would recommend copying all your dependent jars to the Spark JobServer Linux box and adding them to dependent-jar-uris in the Spark JobServer conf itself. – noorul Aug 13 '16 at 07:21
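A minimal sketch of the configuration approach noorul suggests in the last comment; the conf file location is a placeholder, and the jar paths reuse the /myjars example from the documentation:

    # Add the dependent jars to the context-settings block of the job server's
    # HOCON config (duplicate top-level blocks are merged by HOCON):
    cat >> /path/to/spark-jobserver/local.conf <<'EOF'
    spark {
      context-settings {
        dependent-jar-uris = ["file:///myjars/deps01.jar", "file:///myjars/deps02.jar"]
      }
    }
    EOF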