
To elaborate: I am using Livy to create a Spark session and then submit my jobs to the Livy client, which runs them in that same session. If I need to add a new jar as a dependency for one of the jobs, is there any way to add the jar to the already-running Spark session?

I have tried `spark.jars`, but it is only read when the session is created, not by a session that is already running.
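For context, this is the only point where that setting takes effect. Below is a minimal sketch of a session-creation request body for Livy's REST API (`POST /sessions`); the server address is whatever your Livy endpoint is, and the jar path is hypothetical. The `jars` list, like `spark.jars`, is honored only in this creation call, which is the behavior described above:

```json
{
  "kind": "spark",
  "jars": ["hdfs:///libs/my-dep.jar"],
  "conf": {
    "spark.jars": "hdfs:///libs/my-dep.jar"
  }
}
```

Re-submitting this body later has no effect on a session that is already up, which is why the jar needs to be known before the session starts.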

Thanks

1 Answer


I don't think so. I don't believe you can do that in a plain spark-shell either, and if you could, the approach for Livy would likely follow from it. Try searching for a way to do it in spark-shell first; you may have better luck getting answers there.

harschware