I'm under the impression that if I set spark.executor.memory
to 50G, then based on this formula and the articles linked below,
the maximum heap should be about 30.3G. However, the peak JVM onHeap memory I'm seeing in the Spark UI is 39.3 GiB.
Can someone help explain what is missing from my current understanding?
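For reference, here is a minimal sketch of how I understand the calculation from the Spark tuning guide (linked below). It assumes the defaults spark.memory.fraction = 0.6 and roughly 300 MB of reserved memory; these defaults are my assumption, not values I've overridden:

```python
# Sketch of the on-heap unified memory calculation, assuming the defaults
# described in the Spark tuning guide: spark.memory.fraction = 0.6 and
# ~300 MiB of reserved memory.

GIB = 1024  # MiB per GiB

executor_memory_mib = 50 * GIB   # spark.executor.memory = 50G
reserved_mib = 300               # fixed reserved memory
memory_fraction = 0.6            # spark.memory.fraction default

# Unified (execution + storage) memory that Spark manages on the heap
unified_mib = (executor_memory_mib - reserved_mib) * memory_fraction

print(f"unified memory ≈ {unified_mib / GIB:.1f} GiB")  # ≈ 29.8 GiB, roughly the ~30 G I expected
```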
The values I'm seeing in the Spark UI do not match the understanding I got from reading:
- https://spark.apache.org/docs/3.1.2/tuning.html#memory-management-overview
- https://medium.com/walmartglobaltech/decoding-memory-in-spark-parameters-that-are-often-confused-c11be7488a24
I'm also using the following config:
spark.executor.memoryOverhead=10G
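In case it matters, here is a minimal sketch of how these options are applied (assuming the job is created through a SparkSession builder rather than spark-submit flags; the app name is hypothetical, and the config keys are the same either way):

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("memory-question")                      # hypothetical app name
    .config("spark.executor.memory", "50g")          # executor JVM heap size (-Xmx)
    .config("spark.executor.memoryOverhead", "10g")  # off-heap / non-JVM memory, outside the heap
    .getOrCreate()
)

# On YARN / Kubernetes the per-executor container request would be roughly
# spark.executor.memory + spark.executor.memoryOverhead = 60g,
# but my understanding is that the overhead is not part of the JVM heap itself.
```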