
I have a stable/spark Helm deployment running on Kubernetes and I am submitting jobs through Livy.

curl -X POST --data '{"className": "LogBundleConfigFetcher", "file": "http:///aliceparser_2.11-19.12.09.jar", "args": [""]}' -H "Content-Type: application/json" http://:8998/batches

I am able to see the driver logs in the Livy pod, but I am not able to see the executor logs.

Is there any way I can see the executor logs?


1 Answer


The Livy API doesn't provide a way to access Spark executor logs.
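
Note that on Kubernetes the executors run as separate pods, so while the application is running you can read their logs directly with kubectl. A minimal sketch, assuming the default Spark-on-Kubernetes pod labels, with <namespace> and <executor-pod-name> as placeholders:

# List the executor pods of the running application
# (Spark on Kubernetes labels them with spark-role=executor)
kubectl get pods -n <namespace> -l spark-role=executor

# Tail the logs of a particular executor pod
kubectl logs -f -n <namespace> <executor-pod-name>

Keep in mind that executor pods are typically cleaned up when the application finishes, so their logs disappear with them; for anything persistent you need a log-collection stack.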

I would recommend looking at the Grafana Loki project for easy log collection in a Kubernetes cluster.
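
For reference, a minimal sketch of installing Loki together with Promtail via Helm, assuming the grafana/loki-stack chart (chart names, values, and flags may differ between versions):

# Add the Grafana chart repository and install the Loki stack;
# Promtail runs as a DaemonSet and tails container logs on every node,
# including the Spark executor pods
helm repo add grafana https://grafana.github.io/helm-charts
helm repo update
helm install loki grafana/loki-stack \
  --namespace logging --create-namespace \
  --set promtail.enabled=true

Once the logs are in Loki you can query the executor output in Grafana with a LogQL selector on the standard Kubernetes labels, e.g. something like {namespace="<spark-namespace>", pod=~".*exec.*"}.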

Livy could also be customized to collect executor logs, but that would add significant overhead at scale.

  • We are using Kibana, but that is also showing only the driver logs. Should I use a spark-history-server deployment to store the logs somewhere? – Sumit G Jan 17 '20 at 05:01
  • Spark History Server stores event logs only (the snapshot of the Spark UI, not the STDOUT). Concerning Kibana - the issue is in the way you collect the logs. You probably need to add some additional configs to your Elastic/Kibana setup to collect executor logs, or check the query you use in Kibana. Unfortunately I don't have an ELK setup at hand, but I can share with you the Prometheus stack I use to monitor Spark apps in Kubernetes: https://github.com/jahstreet/spark-on-kubernetes-helm/tree/master/charts/spark-monitoring. Hope it helps. – Aliaksandr Sasnouskikh Jan 17 '20 at 09:04