I am using PySpark to run an application on a standalone cluster in client mode, and I want to monitor it.
All I want to do is see the logs.
I've tried two things:
1) I set the following in the config file (spark-defaults.conf) in SPARK_HOME/conf:
spark.eventLog.dir hdfs://<ip>:<port>/<path to directory>
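(For reference, my understanding from the Spark docs is that event logging is off by default and has to be switched on explicitly, so the relevant block would look something like this, with the host, port, and path as placeholders:)
spark.eventLog.enabled true
spark.eventLog.dir hdfs://<ip>:<port>/<path to directory>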
2) I set the following in my Python script:
import pyspark

conf = pyspark.SparkConf()
# point the event log directory at HDFS
conf.set("spark.eventLog.dir", "hdfs://<ip>:<port>/<path to directory>")
sc = pyspark.SparkContext(conf=conf)
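As a sanity check (just a sketch, using the conf object built above), I can read the value back to confirm it was set:
# should print the hdfs:// path set above
print(conf.get("spark.eventLog.dir"))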
Neither of these seems to produce logs in the directory. Is there anything else I can try?
Thank you. This is Spark 1.3.