I started exploring Spark two days ago, so I am pretty new to it. My use case is calling a Java function from an external jar in Scala code that I am writing in spark-shell, but I think I am not loading the jar properly. Here is what I am doing:
spark-shell --master local[2] --jars pathToJarsWithComma --conf="spark.driver.extraClassPath=pathToJarsWithColon" --conf="spark.executor.extraClassPath=pathToJarsWithColon"
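To make this concrete, here is what the invocation looks like with example values filled in (the path /opt/libs/mylib.jar is a hypothetical stand-in for my actual jars; multiple jars would be comma-separated in --jars and colon-separated in the two classpath settings):

spark-shell --master local[2] --jars /opt/libs/mylib.jar --conf="spark.driver.extraClassPath=/opt/libs/mylib.jar" --conf="spark.executor.extraClassPath=/opt/libs/mylib.jar"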
This is how I launch spark-shell, passing all the required jars. Whenever I try to call the Java static function, like this:
rdd1.collect.foreach(a => MyClass.myFunction(a))
I get the following error:
<console>:26: error: not found: value MyClass
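For reference, the class inside the external jar is an ordinary Java class exposing a public static method, roughly like this (the package and names are simplified stand-ins for my actual code):

package com.example; // hypothetical package name

public class MyClass {
    // Plain public static method; my real function has the same shape
    public static String myFunction(String input) {
        return "processed: " + input;
    }
}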
I want to know whether my understanding is correct: can we use Java functions in Spark by loading external jars? If yes, what am I doing wrong here? Please guide.