
what is "spark.executor.memoryOverhead" and "spark.memory.fraction"?

What are their default values?

suresh c
    Possible duplicate of [Why increase spark.yarn.executor.memoryOverhead?](https://stackoverflow.com/questions/49988475/why-increase-spark-yarn-executor-memoryoverhead) – Sach Dec 10 '18 at 05:50

1 Answer


The spark.memory.fraction parameter determines how much memory is available for storage and for execution. If you are caching many objects in memory, you will need more storage memory (spark.memory.fraction can be 0.5/0.6). However, if you are using memory largely for execution purposes, you need that memory to be available for execution (spark.memory.fraction can be 0.2/0.3).
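
For reference, here is a minimal sketch (assuming Spark 2.x+ on YARN; the app name and values are only illustrative, not recommendations) of how both properties might be set when building a SparkSession. They can equally be passed to spark-submit with `--conf`.

```scala
import org.apache.spark.sql.SparkSession

// Illustrative only: set both memory-related properties explicitly.
val spark = SparkSession.builder()
  .appName("memory-config-example")
  // Fraction of the JVM heap (minus a reserved ~300 MB) shared by execution
  // and storage; the default is 0.6 in Spark 2.x.
  .config("spark.memory.fraction", "0.6")
  // Extra off-heap memory requested per executor container on YARN;
  // the default is max(10% of executor memory, 384 MB).
  .config("spark.executor.memoryOverhead", "1g")
  .getOrCreate()

// Equivalent at submit time:
//   spark-submit --conf spark.memory.fraction=0.6 \
//                --conf spark.executor.memoryOverhead=1g ...
```

Note that spark.executor.memoryOverhead affects how large a container is requested from YARN, so it should be set before the application starts requesting executors (e.g. via spark-submit or spark-defaults.conf).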

Prashant