
I'm using Livy on HDInsight to submit jobs to a Spark cluster. I have my code written and compiled into a JAR, but it has multiple dependencies, some of which are from a custom repository.

How do I get Livy to resolve these dependencies on its own? I don't want to submit a fat JAR because the dependencies change frequently, and I'd like to avoid the ops effort involved.

SiddharthaRT

1 Answer


You may want to pass spark.jars.ivy as a parameter. Refer to https://spark.apache.org/docs/latest/configuration.html

You can pass anything that Livy supports in the /batches POST body: https://github.com/cloudera/livy#post-batches
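
For illustration, here is a minimal sketch of such a POST to /batches using Python's requests library. The cluster URL, credentials, JAR path, class name, Maven coordinates, and repository URL are placeholders, not values from the question. Besides spark.jars.ivy, the sketch also sets spark.jars.packages and spark.jars.repositories, the Spark properties that pull Maven coordinates (and their transitive dependencies) from a custom repository at submit time.

    import json
    import requests

    # All values below are placeholders: substitute your own cluster endpoint,
    # credentials, JAR location, main class, coordinates, and repository URL.
    livy_url = "https://<your-cluster>.azurehdinsight.net/livy/batches"

    body = {
        "file": "wasbs:///jars/my-app.jar",   # application JAR on cluster storage
        "className": "com.example.Main",      # main class of the application
        "conf": {
            # Maven coordinates to resolve, including transitive dependencies
            "spark.jars.packages": "com.X.Y:my-artifact:0.3",
            # Additional remote repositories, e.g. your custom repository
            "spark.jars.repositories": "https://repo.example.com/maven",
            # Optional: directory for Ivy's resolution cache
            "spark.jars.ivy": "/tmp/.ivy2",
        },
    }

    response = requests.post(
        livy_url,
        data=json.dumps(body),
        headers={"Content-Type": "application/json"},
        auth=("admin", "<cluster-password>"),  # HDInsight Livy uses basic auth
    )
    print(response.status_code, response.json())

These conf keys mirror what you would pass to spark-submit as --packages, --repositories, and --conf spark.jars.ivy=..., so you can verify the resolution with spark-submit first and then carry the same values over to the Livy request.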

aggFTW
  • Will this resolve transitive dependencies? I.e., if I specify com.X.Y 0.3, will it also pull in all the dependencies of com.X.Y listed in its pom.xml in the repository? – SiddharthaRT May 02 '17 at 09:17
  • It will behave exactly as Spark behaves. Try a spark-submit job to figure out which parameters you need to pass, and then replicate them with Livy. – aggFTW May 03 '17 at 20:47