My apologies if I have made any mistakes in my language.
I want to install the Apache Livy server on a node (VM instance) outside the Spark cluster. How can I do this so that the Livy server points to the Spark cluster?
I have downloaded and built Livy on the VM instance using:
git clone https://github.com/cloudera/livy.git
cd livy
mvn clean package -DskipTests
Then I made the following changes in livy/conf/livy.conf:
livy.spark.master = spark://{spark-cluster-master_IP}:7077
livy.spark.deploy-mode = cluster
I started the Livy server using the command:
livy/bin/livy-server start
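Before creating sessions, it may help to confirm the server is actually reachable; a minimal sketch of such a check, assuming Livy is on the default port 8998:

import requests

host = 'http://localhost:8998'

# GET /sessions lists existing sessions; a 200 response confirms
# the server is up and the REST API is reachable.
r = requests.get(host + '/sessions')
print(r.status_code)   # expect 200
print(r.json())        # e.g. {'from': 0, 'total': 0, 'sessions': []}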
And I am trying to interact with it using the REST API from Python:
>>> import json, pprint, requests, textwrap
>>> host = 'http://localhost:8998'
>>> data = {'kind': 'spark'}
>>> headers = {'Content-Type': 'application/json'}
>>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
>>> r.json()
{u'kind': u'spark', u'log': [], u'proxyUser': None, u'appInfo': {u'driverLogUrl': None, u'sparkUiUrl': None}, u'state': u'starting', u'appId': None, u'owner': None, u'id': 2}
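Since the session comes back in the 'starting' state, a single follow-up GET can race the state transition; a small polling sketch (reusing the host and headers above, and assuming session id 2 from the response) would wait until the session reaches a stable state:

import time
import requests

host = 'http://localhost:8998'
headers = {'Content-Type': 'application/json'}
session_url = host + '/sessions/2'   # id taken from the response above

# Poll until the session leaves the transient startup states.
while True:
    state = requests.get(session_url, headers=headers).json()['state']
    print(state)
    if state not in ('not_started', 'starting'):
        break
    time.sleep(2)
# 'idle' means the session is ready; 'dead' or 'error' means startup failed.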
>>> session_url = host + r.headers['location']
>>> r = requests.get(session_url, headers=headers)
>>> r.json()
{u'kind': u'spark', u'log': [], u'proxyUser': None, u'appInfo': {u'driverLogUrl': None, u'sparkUiUrl': None}, u'state': u'dead', u'appId': None, u'owner': None, u'id': 2}
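When the state flips to 'dead' like this, Livy's REST API also exposes the session's own log via GET /sessions/{id}/log, which usually contains the spark-submit output explaining the failure; a sketch of fetching it, again assuming session id 2:

import requests

host = 'http://localhost:8998'

# GET /sessions/{id}/log returns the captured log lines for the session,
# typically including the spark-submit stderr that explains why it died.
r = requests.get(host + '/sessions/2/log', params={'from': 0, 'size': 100})
for line in r.json().get('log', []):
    print(line)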
The server log file (livy/logs/livy-umesh-server.out) shows nothing about why the Spark session died:
livyserver:~$ cat livy/logs/livy-umesh-server.out
log4j:WARN No appenders could be found for logger (com.cloudera.livy.LivyConf).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.