
I am trying to implement an algorithm (the Fast Fourier Transform, FFT) in Spark.

When the data is over 64 MB (though that is still really small), the console shows the message:

java.lang.OutOfMemoryError: Java heap space

I am using a 32-bit Ubuntu system with Spark 1.6.0, and the Scala IDE for Eclipse.

I have tried both local mode and standalone mode. When the data is smaller than 32 MB it works fine, but when it is larger than 64 MB it fails with the error above.

I have tried setting the JVM options to -Xms1000m and -Xmx4000m. I also tried adding .set("spark.executor.memory", "8g") when creating the SparkConf in the code (sketched below). Neither of them works.
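For reference, a minimal sketch of the kind of setup described (the app name and master URL are placeholders; note that in local mode the executor runs inside the driver JVM, so this setting does not enlarge the driver's heap):

    import org.apache.spark.{SparkConf, SparkContext}

    // Sketch of the attempted configuration; "FFT" and local[*] are placeholders.
    val conf = new SparkConf()
      .setAppName("FFT")
      .setMaster("local[*]")
      .set("spark.executor.memory", "8g") // no effect on the driver heap in local mode
    val sc = new SparkContext(conf)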

How can I fix this?

  • Possible duplicate of http://stackoverflow.com/questions/21138751/spark-java-lang-outofmemoryerror-java-heap-space – jacks Oct 19 '16 at 09:33
  • The problem is solved. I was using 32-bit Ubuntu; after changing to a 64-bit system, the problem went away. – cheng.y Jul 13 '17 at 01:52

1 Answer


If you work with spark.master = local, then the relevant value to adjust is spark.driver.memory. Note that in local mode this option needs to be set before the JVM (i.e. the driver) is launched, so modifying the conf of an existing SparkContext won't help, because the JVM has already started.
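As an illustration, one way to check whether the driver heap actually has the size you expect is to print the JVM's maximum heap from inside the application. A minimal sketch (the 4g figure and the HeapCheck name are just examples):

    import org.apache.spark.{SparkConf, SparkContext}

    object HeapCheck {
      def main(args: Array[String]): Unit = {
        // spark.driver.memory must be in effect before this JVM starts, e.g. via
        //   spark-submit --driver-memory 4g --class HeapCheck app.jar
        // or, when launching from the IDE, via a -Xmx4g VM argument in the
        // run configuration. Setting it on the SparkConf here is too late.
        val conf = new SparkConf().setAppName("HeapCheck").setMaster("local[*]")
        val sc = new SparkContext(conf)
        // Report the heap actually available to the driver.
        val maxHeapMb = Runtime.getRuntime.maxMemory / (1024 * 1024)
        println(s"Driver max heap: $maxHeapMb MB")
        sc.stop()
      }
    }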

Raphael Roth
  • Thank you so much. But I found that my problem was the 32-bit system. I reinstalled a 64-bit system and the problem is solved. – cheng.y Jul 13 '17 at 01:50