Sorry for the basic question, but I couldn't figure it out by myself.
I was trying to work out, from the Spark UI, how much memory is available and how much is used on each worker and on the driver.
Is there a straightforward way to monitor this information?
My goal is to choose a persistence strategy based on how much space my data occupies on the workers and on the driver.
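The only programmatic option I've found so far is `sc.getExecutorMemoryStatus`, which reports, per executor, the maximum memory available for storage and the memory still free. A minimal sketch of what I mean (assuming a Spark 1.6 application with a live `SparkContext`; run it on a cluster, this won't do anything useful without one):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MemoryStatus {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MemoryStatus").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // getExecutorMemoryStatus returns Map[String, (Long, Long)]:
    // executor address -> (max memory available for storage, remaining free memory)
    sc.getExecutorMemoryStatus.foreach { case (executor, (maxMem, freeMem)) =>
      println(s"$executor: max storage = ${maxMem / (1024 * 1024)} MB, " +
        s"free = ${freeMem / (1024 * 1024)} MB")
    }

    sc.stop()
  }
}
```

But this only shows the storage memory per executor, not an overall used/available breakdown for the workers and the driver, which is why I'm hoping the UI exposes it somewhere.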
P.S. I am using standalone mode on Spark 1.6.1.