
Spark version: 3.3. I don't set spark.memory.offHeap.enabled, which according to the official documentation defaults to false. But in the Spark History UI, I found that "Spark Peak Execution Memory" is not zero. Does "off-heap" here mean executor memory overhead? Why can I see off-heap memory being used here?

asked by gang

1 Answer


You can search for spark.memory.offHeap.enabled under Spark History UI -> Environment to see whether it is really false. The Environment tab only lists properties that were explicitly set, so if the search returns a match the property was set to true; if the search matches nothing, the property is unset and falls back to the default of false.

Alternatively, set it explicitly with --conf spark.memory.offHeap.enabled=true (or false) when submitting the application. Note that enabling off-heap memory also requires setting spark.memory.offHeap.size to a positive value.
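As a sketch, setting the flag at submit time and checking its effective value from a running application might look like this (the application file name and the 2g size are placeholder examples, not values from the question):

```shell
# Explicitly enable off-heap execution memory at submit time.
# Enabling it also requires a positive spark.memory.offHeap.size.
spark-submit \
  --conf spark.memory.offHeap.enabled=true \
  --conf spark.memory.offHeap.size=2g \
  my_app.py

# To check the effective value from inside a running PySpark app,
# read the conf with a default of "false" (returned when unset):
#   spark.conf.get("spark.memory.offHeap.enabled", "false")
```

Reading the conf with an explicit default avoids the exception that spark.conf.get raises for unset keys, which mirrors the Environment-tab behavior: unset properties simply do not appear.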

answered by AppleCEO