
Is there a way to execute Spark code (packaged in a jar) from NiFi using Livy?

I can see that NiFi's ExecuteSparkInteractive processor can submit custom code to be run on a Spark cluster via Livy. But what I want is to pass the name of the jar file and the main class in NiFi, which then connects to Spark via Livy.

I found the article below on this, but it seems an option like Session JARs is not available in a plain NiFi installation.

https://community.cloudera.com/t5/Community-Articles/Apache-Livy-Apache-NiFi-Apache-Spark-Executing-Scala-Classes/ta-p/247985
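For reference, Livy's batch endpoint (`POST /batches`) accepts a jar path and main class directly, without embedding code in NiFi. A minimal sketch of building and sending that request in Python; the Livy host, jar path, and class name are placeholder assumptions:

```python
import json
import urllib.request

def livy_batch_payload(jar_path, main_class, args=None, conf=None):
    """Build the JSON body for Livy's POST /batches endpoint."""
    payload = {"file": jar_path, "className": main_class}
    if args:
        payload["args"] = list(args)
    if conf:
        payload["conf"] = dict(conf)
    return payload

def submit_batch(livy_url, payload):
    """POST the batch job to Livy; returns the parsed JSON response."""
    req = urllib.request.Request(
        f"{livy_url}/batches",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example payload; submit_batch("http://livy-host:8998", payload)
# would launch the job (8998 is Livy's default port).
payload = livy_batch_payload(
    "hdfs:///jobs/my-spark-app.jar",
    "com.example.MainClass",
    args=["--date", "2020-10-27"],
)
```

From NiFi, the same POST could be issued with an InvokeHTTP processor instead of a script.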

GPopat
  • Well, via the REST API it should certainly be possible – Georg Heiler Oct 27 '20 at 11:09
  • @GeorgHeiler Could you please elaborate a bit more? – GPopat Oct 27 '20 at 15:14
  • Well, NiFi can execute arbitrary code via bash/python scripts, and I believe it also has a processor to perform REST calls. You would need to 1) upload the JAR via WebHDFS to a location accessible to Livy, and 2) use the REST API of Livy to start the job. – Georg Heiler Oct 27 '20 at 15:16
  • But apparently: https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi/nifi-livy-nar/1.6.0/org.apache.nifi.controller.livy.LivySessionController/ is also available in a more natively integrated format. Have you tried that one already? – Georg Heiler Oct 27 '20 at 15:16
  • Yes, I saw that with respect to NiFi. It requires embedding the Spark code within NiFi, which is not the case in my situation. – GPopat Oct 27 '20 at 15:27
  • The linked NiFi controller service would contain a JAR parameter. But what do you mean by integrated? I thought that the controller service is natively integrated into NiFi - what would you want to have instead/more? – Georg Heiler Oct 27 '20 at 15:29
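Step 1 of the flow described in the comments (uploading the jar through WebHDFS so Livy can reach it) can be sketched as follows; the namenode host and HDFS path are placeholder assumptions:

```python
import urllib.request

def webhdfs_create_url(namenode, hdfs_path, overwrite=True, port=9870):
    """Build the WebHDFS URL for creating a file (uploading the jar).

    WebHDFS answers the initial PUT with a 307 redirect pointing at a
    datanode; the jar bytes are then PUT to that redirect location.
    Port 9870 is the Hadoop 3 default; Hadoop 2 used 50070.
    """
    flag = "true" if overwrite else "false"
    return (f"http://{namenode}:{port}/webhdfs/v1{hdfs_path}"
            f"?op=CREATE&overwrite={flag}")

url = webhdfs_create_url("namenode.example.com", "/jobs/my-spark-app.jar")
# A PUT to `url` (e.g. via NiFi's InvokeHTTP processor) starts the
# two-step upload handshake; the jar can then be referenced from Livy
# as hdfs:///jobs/my-spark-app.jar.
```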

0 Answers