
I'm running a Hadoop job (from Oozie) that has a few counters and uses MultipleOutputs.

I get an error like: org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many counters: 121 max=120

I then removed all the code that uses counters, set mout.setCountersEnabled to false, and raised the maximum number of counters to 240 in the Hadoop config.

Now I still get the same kind of error: org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many counters: 241 max=240

How can I solve this problem? Is it possible that hidden counters exist? How can I find out which counters there are before the limit of 240 is exceeded? (The process seems to stop before I can print anything.)
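
For reference, a minimal sketch of the two changes described above, on the driver side (assuming the new-API org.apache.hadoop.mapreduce.lib.output.MultipleOutputs and the mapreduce.job.counters.limit property; the real job may set the limit through the config files instead):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

    Job job = Job.getInstance(new Configuration());

    // Turn off the per-named-output counters of MultipleOutputs
    // (the "mout.setCountersEnabled to false" change)...
    MultipleOutputs.setCountersEnabled(job, false);

    // ...and raise the counter limit from the default 120 to 240.
    job.getConfiguration().setInt("mapreduce.job.counters.limit", 240);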

Thanks, Xinsong

lixinso
  • Is this post helpful? http://mapredit.blogspot.gr/2012/12/hive-query-error-too-many-counters.html – vefthym Jan 03 '14 at 11:47
  • I found the reason: it's the MultipleOutputs. Each named output has a counter by default, and after my change there are more named outputs, so the limit is exceeded. – lixinso Jan 09 '14 at 07:43

2 Answers


I solved the problem with the following change: vi $HADOOP_HOME/conf/mapred-site.xml

<property>
    <name>mapreduce.job.counters.limit</name>
    <!--<value>120</value>-->
    <value>20000</value>
    <description>Limit on the number of counters allowed per job. The default value is 200.</description>
</property>
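
If editing mapred-site.xml is not an option, the same property can also be set per job in the driver code; a sketch, assuming the client-side value is honoured by your Hadoop version (the limit may also be enforced by the cluster-side daemons):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    // Raise the counter limit for this job only, before it is submitted.
    Configuration conf = new Configuration();
    conf.setInt("mapreduce.job.counters.limit", 20000);
    Job job = Job.getInstance(conf);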
余建新

I found the reason. It's the MultipleOutputs: each named output gets a counter by default. After my change there are more named outputs, so the job exceeds the counter limit.
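
For illustration, a sketch of where those counters come from (the named outputs here are made up, and the new-API org.apache.hadoop.mapreduce.lib.output.MultipleOutputs is assumed):

    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;
    import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

    // "job" is the org.apache.hadoop.mapreduce.Job being configured.
    // Each named output registered here gets its own counter while counters
    // are enabled, so every extra named output adds to the job's counter total.
    MultipleOutputs.addNamedOutput(job, "errors", TextOutputFormat.class, Text.class, Text.class);
    MultipleOutputs.addNamedOutput(job, "stats", TextOutputFormat.class, Text.class, Text.class);

    // Disabling these counters keeps the named outputs out of the limit.
    MultipleOutputs.setCountersEnabled(job, false);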