I have installed Spark 2.0.2 pre-built for Hadoop 2.4 and later from here: https://spark.apache.org/downloads.html. Then I created a cluster composed of 1 master and 2 workers, and I installed Ganglia on all 3 machines (gmetad and gmond on the master, gmond only on the workers). I need to monitor the cluster's CPU, memory, and disk usage while a Spark application is running, to measure the performance of my cluster.

My question is: how do I integrate Ganglia with Spark, so that Spark metrics show up in the Ganglia web UI? I know that we must configure the metrics.properties file in $SPARK_HOME/conf to set up the Ganglia sink. I did this (see the sketch below), but I learned here that the sink requires an LGPL-licensed package which is not included in the default build. How can I install it when I am using a pre-built Spark? Do I have to rebuild Spark, and if so, how? In the two questions linked below, the Spark that was used seems to have been built with mvn or sbt, which is not the same as what I have (Spark pre-built):
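For reference, my metrics.properties currently looks roughly like the sketch below; the host name is a placeholder for my master node, and 8649 is gmond's default port:

```
# Route all Spark metrics to the Ganglia sink
# (the GangliaSink class lives in the LGPL spark-ganglia-lgpl package)
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink

# gmond host and port -- "master-node" is a placeholder for my master machine
*.sink.ganglia.host=master-node
*.sink.ganglia.port=8649
*.sink.ganglia.mode=unicast

# how often metrics are polled and sent to Ganglia
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
```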
- Spark Monitoring with Ganglia
- How to integrate Ganglia for Spark 2.1 Job metrics, Spark ignoring Ganglia metrics
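From the Spark build documentation, the Ganglia sink appears to be enabled by building from source with the spark-ganglia-lgpl Maven profile. So my guess is that a rebuild would look something like this (run from the Spark source root; the hadoop-2.4 profile is my assumption, chosen to match the pre-built package I downloaded):

```
# sketch of a source build that includes the Ganglia sink
./build/mvn -Phadoop-2.4 -Pspark-ganglia-lgpl -DskipTests clean package
```

But since I started from the pre-built binaries rather than the source, I am not sure whether rebuilding is the only option, or whether the missing spark-ganglia-lgpl package can somehow be added to my existing installation.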
Thank you