I am running a Spark application on YARN with the driver and executor memory set as --driver-memory 4G --executor-memory 2G.
When I run the application, an exception is thrown complaining: Container killed by YARN for exceeding memory limits. 2.5 GB of 2.5 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
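For context, the full submit command looks roughly like this (the master/deploy-mode, class name, and jar are placeholders, not my exact values):

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 4G \
  --executor-memory 2G \
  --class com.example.MyApp \
  my-app.jar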
What does this 2.5 GB mean here: the overhead memory, the executor memory, or overhead plus executor memory? I ask because when I change the memory settings to:
--driver-memory 4G --executor-memory 4G --conf spark.yarn.executor.memoryOverhead=2048
the exception disappears.
So my question is: although I have only boosted the overhead memory to 2 GB, which is still under 2.5 GB, why does it work now?
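To spell out the arithmetic behind my confusion (my reading of the limit may well be wrong): if the 2.5 GB refers to the overhead alone, my new overhead of 2048 MB (2 GB) is still below 2.5 GB, so I would expect the container to be killed again; if it refers to executor memory plus overhead, the new total is 4096 MB + 2048 MB = 6144 MB (6 GB), far above 2.5 GB.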