
I am facing two errors: "Error: Java heap space" and "Error: GC overhead limit exceeded".

So I started looking into hadoop-env.sh.

This is what I understand so far; please correct me if I am wrong.

If HADOOP_HEAPSIZE=7168 is set in hadoop-env.sh,

then the DataNode daemon and the TaskTracker daemon on the slave node are each started with a 7 GB heap (DataNode (7 GB) + TaskTracker (7 GB) = 14 GB).
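
For reference, this is the relevant line, sketched below; HADOOP_HEAPSIZE is given in MB, and the same value is applied to each Hadoop daemon launched on the node:

    # hadoop-env.sh
    # Maximum heap size, in MB; every Hadoop daemon started on this node
    # (DataNode, TaskTracker, ...) gets a JVM with -Xmx of this size.
    export HADOOP_HEAPSIZE=7168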

and

    mapred.tasktracker.reduce.tasks.maximum = 3
    mapred.tasktracker.map.tasks.maximum = 6
    mapred.child.java.opts = -Xmx1024m
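
For completeness, here is roughly how those settings look in mapred-site.xml (a sketch using the MRv1 property names from above):

    <!-- mapred-site.xml (MRv1) -->
    <property>
      <name>mapred.tasktracker.map.tasks.maximum</name>
      <value>6</value>   <!-- at most 6 concurrent map task JVMs per TaskTracker -->
    </property>
    <property>
      <name>mapred.tasktracker.reduce.tasks.maximum</name>
      <value>3</value>   <!-- at most 3 concurrent reduce task JVMs per TaskTracker -->
    </property>
    <property>
      <name>mapred.child.java.opts</name>
      <value>-Xmx1024m</value>   <!-- each child task JVM gets a 1 GB max heap -->
    </property>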

So the TaskTracker can invoke up to 9 child JVMs (6 map + 3 reduce), each with a 1 GB heap, for a total of 9 GB.

But the TaskTracker itself is started with a 7 GB heap, so this will conflict: as I understand it, the maximum memory for the TaskTracker and the child JVMs it invokes is 7 GB, yet they are consuming 9 GB.

So that is why the heap space error occurred. Is my calculation correct?
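
To make my budget explicit, here is the arithmetic under the settings above, assuming all task slots are occupied at the same time:

    DataNode heap:                          7 GB   (HADOOP_HEAPSIZE=7168)
    TaskTracker heap:                       7 GB   (HADOOP_HEAPSIZE=7168)
    Child JVMs: (6 map + 3 reduce) x 1 GB = 9 GB   (mapred.child.java.opts)
    --------------------------------------------
    Total:                                 23 GB   on a 16 GB machine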

user2950086
  • I referred to http://www.mapr.com/blog/how-to-avoid-java-heap-space-errors-understanding-and-managing-task-attempt-memory#.U6PBj1HxDNz – user2950086 Jun 27 '14 at 05:35
  • But I also read: Total RAM = (Mappers + Reducers) * Child Task Heap + DN heap + TT heap + 3 GB + RS heap + Other Services heap. So it says total RAM will be 7 GB (DataNode daemon) + 7 GB (TaskTracker daemon) + 9 GB (child JVMs) = 23 GB, + 2 GB (left for the OS) = 25 GB, but I only had 16 GB RAM. – user2950086 Jun 27 '14 at 05:46
