I know there are several ways to monitor the storage memory utilization of a Spark application, but does anyone know a way to monitor execution memory utilization? I am also looking for a way to monitor "user memory", i.e. memory that is used for neither execution nor storage. Going by Spark's documentation on memory management (https://spark.apache.org/docs/latest/tuning.html), user memory is the portion of the heap that is not allocated to the unified region M, which is controlled by spark.memory.fraction.
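For what it's worth, here is a minimal sketch of one way to poll this from the application's REST API, assuming a live driver UI at http://localhost:4040 (adjust the host and port for your setup). On Spark 3.0+ the executors endpoint also exposes a `peakMemoryMetrics` object that breaks memory out into execution and storage components; on older versions only the storage-side fields are available, so the `.get()` calls below may return None:

```python
# Sketch: poll the Spark application REST API for per-executor memory metrics.
# Assumes the driver UI is reachable at localhost:4040 and Spark 3.0+ for
# peakMemoryMetrics (execution vs. storage breakdown); older versions only
# report the unified/storage-side numbers.
import requests

UI = "http://localhost:4040/api/v1"  # driver UI base URL; adjust as needed

# Grab the first running application's ID from the UI.
app_id = requests.get(f"{UI}/applications").json()[0]["id"]

for ex in requests.get(f"{UI}/applications/{app_id}/executors").json():
    peak = ex.get("peakMemoryMetrics", {})  # absent on Spark < 3.0
    print(
        ex["id"],
        "storage used (bytes):", ex["memoryUsed"],  # unified memory used for caching
        "peak on-heap execution:", peak.get("OnHeapExecutionMemory"),
        "peak on-heap storage:", peak.get("OnHeapStorageMemory"),
    )
```

Note that user memory is not reported directly by any endpoint I'm aware of; at best it can be estimated as the JVM heap minus the ~300 MB reserved memory minus the unified region defined by spark.memory.fraction.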
I want to know that too!! – astro_asz Feb 12 '18 at 15:43