
We are using Apache Livy 0.6.0-incubating and calling a custom Spark jar through its REST API, using the /batches/ endpoint.

The custom Spark code reads data from HDFS and does some processing. The job completes successfully and the REST response also reports 'SUCCESS'. We want the resulting data to be returned to the client, the way the /sessions/ API returns data. Is there a way to do this?
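A minimal sketch of how we call the /batches/ endpoint is below; the Livy host, jar path, and class name are placeholders rather than our real values. The point is that the batch object Livy returns only carries metadata such as id and state, with no field for the job's own output:

    import requests

    LIVY = "http://livy-host:8998"   # placeholder Livy server

    resp = requests.post(
        f"{LIVY}/batches",
        json={
            "file": "hdfs:///apps/my-spark-job.jar",   # placeholder jar location
            "className": "com.example.MyJob",          # placeholder main class
            "args": ["/input/path"],                   # placeholder arguments
        },
    )
    batch_id = resp.json()["id"]

    # The batch endpoints only expose metadata (id, appId, state, log lines);
    # there is no field carrying the data the job produced.
    state = requests.get(f"{LIVY}/batches/{batch_id}/state").json()["state"]
    print(state)  # e.g. "starting", "running", "success"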

Note: the /sessions/ API only accepts Spark Scala code.
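For comparison, this is roughly how the /sessions/ API hands results back: the statement output comes embedded in the JSON response, but the code has to be submitted as source rather than as a pre-built jar (the host and the code snippet are again placeholders):

    import requests

    LIVY = "http://livy-host:8998"   # placeholder Livy server

    # Create an interactive Scala session; polling until the session
    # becomes idle is omitted here for brevity.
    session_id = requests.post(f"{LIVY}/sessions", json={"kind": "spark"}).json()["id"]

    stmt = requests.post(
        f"{LIVY}/sessions/{session_id}/statements",
        json={"code": 'spark.read.text("/input/path").count()'},
    ).json()

    # Once the statement finishes, its result is returned inline in the response.
    result = requests.get(f"{LIVY}/sessions/{session_id}/statements/{stmt['id']}").json()
    print(result["output"])  # e.g. {"status": "ok", "data": {"text/plain": "res1: Long = ..."}}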

Mata

1 Answer


I have a similar setup. The way I return the data is by writing the Spark result to HDFS, and when I receive a SUCCESS I read the result from HDFS on the client machine.
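A sketch of that pattern, assuming the batch has already been submitted and the jar writes its output to an HDFS path agreed on beforehand (the Livy host, batch id, and result path below are placeholders):

    import subprocess
    import time

    import requests

    LIVY = "http://livy-host:8998"                  # placeholder Livy server
    BATCH_ID = 42                                   # id returned by POST /batches
    RESULT_PATH = "/tmp/livy-results/job-output"    # placeholder output location

    # Poll the batch until it reaches a terminal state.
    while True:
        state = requests.get(f"{LIVY}/batches/{BATCH_ID}/state").json()["state"]
        if state in ("success", "dead", "killed"):
            break
        time.sleep(5)

    if state == "success":
        # Read the result back from HDFS on the client machine.
        out = subprocess.run(
            ["hdfs", "dfs", "-cat", f"{RESULT_PATH}/part-*"],
            capture_output=True, text=True, check=True,
        )
        print(out.stdout)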

Ilya Brodezki
    Thanks, this is the alternative we are following right now as well, but I am wondering if there is any way Livy can handle this with a hook. – Mata Jul 15 '19 at 09:14