I am currently developing a Hadoop program. The job is killed by Hadoop because the mapper task uses a large amount of memory (around 7 GB). Is there a way to make each machine run only one task at a time?
I tried the settings shown below, but they didn't work; the task was still killed by Hadoop.
conf.set("mapreduce.tasktracker.reserved.physicalmemory.mb", "7000");
conf.set("mapred.tasktracker.map.tasks.maximum", "1");
The cluster runs MapR M3, and every machine has 15.6 GB of memory, of which about 70% (roughly 11 GB) is available.