I am trying to implement an algorithm (Fast Fourier Transform, FFT) in Spark. When the input data is over 64 MB (which is really small), the console shows this message:

java.lang.OutOfMemoryError: Java heap space
I am using a 32-bit Ubuntu system, Spark 1.6.0, and the Scala IDE (Eclipse). I have tried both local mode and standalone mode. When the data is smaller than 32 MB it works fine, but when the data is larger than 64 MB it fails with the error above.
I have tried setting the JVM options to -Xms1000m and -Xmx4000m. I also tried adding .set("spark.executor.memory", "8g") when creating the SparkConf in the code. Neither of these helps.
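For context, the configuration code looks roughly like this (a minimal sketch; the app name and master URL are placeholders, not my exact setup):

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal sketch of the configuration I tried; the app name and
    // local master URL are placeholders for the real FFT job setup.
    val conf = new SparkConf()
      .setAppName("FFT")
      .setMaster("local[*]")
      .set("spark.executor.memory", "8g")
    val sc = new SparkContext(conf)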
How can I fix this?