Questions tagged [livy]

Apache Livy is a service that enables easy interaction with a Spark cluster over a REST interface

From http://livy.incubator.apache.org.

What is Apache Livy?

Apache Livy is a service that enables easy interaction with a Spark cluster over a REST interface. It enables easy submission of Spark jobs or snippets of Spark code, synchronous or asynchronous result retrieval, and Spark Context management, all via a simple REST interface or an RPC client library. Apache Livy also simplifies the interaction between Spark and application servers, thus enabling the use of Spark for interactive web/mobile applications. Additional features include:

  • Long-running Spark Contexts that can be used for multiple Spark jobs, by multiple clients
  • Cached RDDs or DataFrames shared across multiple jobs and clients
  • Multiple Spark Contexts managed simultaneously, running on the cluster (YARN/Mesos) instead of the Livy server, for good fault tolerance and concurrency
  • Jobs submitted as precompiled jars, snippets of code, or via the Java/Scala client API
  • Security via secure authenticated communication
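The REST interface described above boils down to POSTing small JSON payloads. A minimal sketch of submitting a precompiled jar as a batch job, where the server URL, jar path, and main class are placeholder values:

```python
import json

# Hypothetical endpoint -- adjust for your cluster. The /batches route and
# the "file"/"className"/"args" payload fields come from the Livy REST API.
LIVY_URL = "http://livy-server:8998"

batch_payload = {
    "file": "hdfs:///jobs/project.jar",    # precompiled jar uploaded to HDFS
    "className": "com.example.SimpleApp",  # placeholder main class
    "args": ["10"],
}

# The actual submission would be an HTTP POST, e.g. with the `requests` package:
#   requests.post(f"{LIVY_URL}/batches", json=batch_payload).json()
print(json.dumps(batch_payload))
```

Livy replies with a JSON description of the batch (including its `id`), which can then be polled for state.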


288 questions
0 votes, 0 answers

Unable to connect External Jar in Livy

We are trying to build a setup where we have a server that submits jobs from different users to the Livy server via the REST API. We have uploaded the jar to HDFS and are calling it from the Livy client. There is demo code, where it is calling…
0 votes, 1 answer

There's no zeppelin.livy.principal in Ambari Zeppelin Service configuration with HDP 3 stack installed

According to this doc, I should find zeppelin.livy.principal in the config of Zeppelin or Spark2, but there is no such option there.
0 votes, 0 answers

spark kafka streaming code not working on livy

I've been working on some Scala code to run Spark Streaming for Kafka data acquisition. After updating all dependencies and adding new ones to the compiled jar, I am able to run it with spark-submit with the code spark-submit…
sartions
0 votes, 0 answers

Zeppelin - Spark Interpreter Unable to create hive table by using CTAS (Create Table as Select ...) statement

I am using Zeppelin and trying to create a Hive table from another Hive table using a CTAS statement, but my query always ends in an error, so the table is not created. I have found a few posts which say to modify the Zeppelin configuration…
JKC
0 votes, 2 answers

Zeppelin, Livy, Can I get the proxyUser

I am trying to obtain the user ID used to log in to Zeppelin within my Scala (%livy) script. I've tried searching online and noted that there is a property named "proxyUser". However, I cannot work out how to get this property within my Scala…
GMc
0 votes, 1 answer

Running a high volume of Hive queries from PySpark

I want to execute a very large number of Hive queries and store the results in a dataframe. I have a very large dataset structured like this: +-------------------+-------------------+---------+--------+--------+ | visid_high| …
Tom Rijntjes
0 votes, 1 answer

Apache Nifi - Submitting Spark batch jobs through Apache Livy

I want to schedule my Spark batch jobs from NiFi. I can see there is an ExecuteSparkInteractive processor which submits Spark jobs to Livy, but it executes the code provided in the property or the content of the incoming flow file. How should I…
Apurba Pandey
0 votes, 1 answer

Unable to submit Pyspark code via ExecuteSparkInteractive processor in Apache NiFi

I am new to Python and the Apache ecosystem. I am trying to submit PySpark code via the ExecuteSparkInteractive processor in Apache NiFi. I do not have detailed knowledge of any of the components being used here; I am only Googling and…
0 votes, 1 answer

zeppelin dynamically load jars

Inside Zeppelin I want to be able to dynamically load jars into Livy from a corporate repository. livy.spark.jars.packages only applies to the interpreter configuration, which is restricted due to security constraints. How can I once set up the…
Georg Heiler
0 votes, 3 answers

Apache Livy cURL not working for spark-submit command

I recently started working with Spark, Scala, HDFS, sbt, and Livy. Currently I am trying to create a Livy batch. Warning: Skip remote jar hdfs://localhost:9001/jar/project.jar. java.lang.ClassNotFoundException: SimpleApp at…
Divya Arya
0 votes, 1 answer

How to keep or check Apache Livy connection?

As we know, creating an Apache Livy connection is expensive: it creates new applications and uploads task files. In my case, users can submit jobs through my web API written in Java, and then I use the Apache Livy client to submit the job to Spark. I want to keep one or…
Moon.Hou
0 votes, 1 answer

Pass value to livy from python

I want to pass a value to Livy code from Python. The value I am passing changes after each call, but the value that reaches Livy stays the same. data_while_loop = { 'code': textwrap.dedent(""" user_data_dict = """ + str(user_ver_dict) +…
Affan
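The string-concatenation pattern in the excerpt above is a common source of stale or malformed values. A hedged sketch, with illustrative variable names, of serializing the Python value with json.dumps and rebuilding the statement payload on every call:

```python
import json
import textwrap

def build_statement(value):
    """Embed `value` into the submitted code as a JSON literal.

    Rebuilding the payload per call means the serialized snapshot is always
    current. Note: the single quotes around %s assume the serialized value
    contains no single quotes itself.
    """
    return {
        "code": textwrap.dedent("""\
            import json
            user_data_dict = json.loads('%s')
            print(user_data_dict)
        """) % json.dumps(value)
    }

# Illustrative value to ship to the Livy session.
user_ver_dict = {"user": "alice", "version": 2}
payload = build_statement(user_ver_dict)
```

Each `payload` would then be POSTed to the session's /statements endpoint as usual.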
0 votes, 1 answer

REST api for apache spark

I have a .py file that does machine learning with Apache Spark using MongoDB, and I want to connect the results of my code to an Android application. Is there a way to do that with a REST API? I have heard about TensorFlow and Livy!
betty bth
0 votes, 0 answers

unable to successfully submit a job through Livy in cluster mode

I have the following setup, but I am unable to successfully submit a job through Livy in cluster mode. Here are my settings: spark-defaults.conf spark.master yarn livy.conf livy.spark.master=yarn livy.spark.deploy-mode = cluster livy.server.recovery.mode =…
user1870400
0 votes, 0 answers

Submit python code with Livy

How can I submit PySpark code with Livy? I used this and it works: curl localhost:8998/sessions/0/statements -X POST -H 'Content-Type: application/json' -d '{"code":"srdd.get(\"ak\")"}' But now I would like to pass a command, for example this curl…
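For the multi-line case this question is heading toward, one option is to JSON-encode the snippet first, so newlines become \n escapes inside the -d body. A minimal sketch, where the snippet itself is illustrative:

```python
import json

# A multi-line PySpark snippet; the newline must become \n inside the JSON body.
snippet = "rdd = sc.parallelize(range(10))\nprint(rdd.count())"

# json.dumps handles the escaping of newlines and quotes.
body = json.dumps({"code": snippet})

# `body` can then be used directly as the -d argument:
#   curl localhost:8998/sessions/0/statements -X POST \
#        -H 'Content-Type: application/json' -d "$body"
print(body)
```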