When performing several joins (4 of them) on Spark DataFrames, I get the following error:
org.apache.spark.shuffle.FetchFailedException: failed to allocate 16777216 byte(s) of direct memory (used: 4294967296, max: 4294967296)
The error persists even after setting:
--conf "spark.executor.extraJavaOptions=-XX:MaxDirectMemorySize=4G" \
(Note: my original command was missing the "=" after extraJavaOptions.)
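For reference, 4294967296 bytes is exactly 4 GiB, so the limit reported in the error equals the 4G I tried to configure; raising the limit above that (on both executor and driver, in case the failure occurs on either side) is what I would expect to help. A sketch of the invocation I am trying, where the application jar name and the 8G value are placeholders:

```shell
# Hypothetical spark-submit invocation; the jar name and memory sizes are
# placeholders, not values from my actual job.
# Note the "=" between spark.executor.extraJavaOptions and its value,
# which my original command was missing.
spark-submit \
  --conf "spark.executor.extraJavaOptions=-XX:MaxDirectMemorySize=8G" \
  --conf "spark.driver.extraJavaOptions=-XX:MaxDirectMemorySize=8G" \
  my-app.jar
```

Is raising -XX:MaxDirectMemorySize the right fix here, or is there a Spark/Netty setting that controls this direct-memory pool?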