
I have an application that writes Spark DataFrame data into Hive.

The first time, the application used 100 cores and 10 GB of memory, producing the OutOfMemory error below after leaking a lot of 32 MB chunks.

[screenshot: OutOfMemory error after leaking 32 MB chunks]

After that, I ran the application with 100 cores and 20 GB of memory, obtaining a different leak size (64 MB) followed by the same OutOfMemory error:

[screenshot: same OutOfMemory error, this time after leaking 64 MB chunks]

Can anyone help me understand this behaviour?
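For reference, since the question does not show the job's code or submit settings, here is a minimal sketch of what a DataFrame-to-Hive write and the resource configuration (driver/executor memory and cores, as asked about in the comments) might look like. All names, paths, table names, and values below are assumptions, not the asker's actual setup:

```scala
// Hypothetical spark-submit resource flags for a job of this shape:
//   spark-submit \
//     --num-executors 20 --executor-cores 5 \   // 100 cores in total
//     --executor-memory 10g --driver-memory 4g \
//     app.jar

import org.apache.spark.sql.{SaveMode, SparkSession}

object HiveWriteSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dataframe-to-hive")   // hypothetical application name
      .enableHiveSupport()            // required so saveAsTable targets the Hive metastore
      .getOrCreate()

    // Hypothetical source; the question does not say where the DataFrame comes from.
    val df = spark.read.parquet("/path/to/input")

    // Write the DataFrame into a Hive table.
    df.write
      .mode(SaveMode.Append)
      .saveAsTable("mydb.my_table")   // hypothetical database and table name

    spark.stop()
  }
}
```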

  • Why not just increase the memory? Did you try monitoring the servers to see how much memory they use? You should also mention how you configured the driver and executor memory and cores; this could also point to the problem. – Tal Joffe Aug 29 '16 at 06:51
  • Reformatted/rephrased whole question – Vincenzo Maggio Aug 29 '16 at 09:22

0 Answers