Question
I'm running a Hadoop job (launched from Oozie) that uses a few counters and MultipleOutputs.
I get an error like: org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many counters: 121 max=120
I then removed all the code that uses counters, set mout.setCountersEnabled to false, and raised the counter limit to 240 in the Hadoop config.
I still get the same error: org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many counters: 241 max=240
How can I solve this problem? Could there be hidden counters? How can I find out which counters exist before the limit of 240 is exceeded? (The job seems to stop before I can print anything.)
Thanks, Xinsong
Answer 1:
I solved the problem as follows: edit $HADOOP_HOME/conf/mapred-site.xml and raise the counter limit:
<property>
  <name>mapreduce.job.counters.limit</name>
  <!--<value>120</value>-->
  <value>20000</value>
  <description>Limit on the number of counters allowed per job. The default value is 200.</description>
</property>
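If editing mapred-site.xml cluster-wide is not an option, the limit can usually also be raised per job from the command line (the property name varies by Hadoop version: newer releases use mapreduce.job.counters.max, older ones mapreduce.job.counters.limit). This is a sketch, not a tested invocation; my-job.jar and com.example.MyDriver are placeholders, and the -D flag only takes effect if the driver parses generic options via ToolRunner/GenericOptionsParser:

```shell
# Per-job override of the counter limit (property name assumes Hadoop 2.x;
# older versions use mapreduce.job.counters.limit instead).
hadoop jar my-job.jar com.example.MyDriver \
  -D mapreduce.job.counters.max=500 \
  input/ output/
```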
Answer 2:
I found the reason: it is the MultipleOutputs. By default, each named output gets its own counter. After my change there were more outputs, so the limit was exceeded again.
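To illustrate where those hidden counters come from, here is a minimal driver sketch using the new-API MultipleOutputs (org.apache.hadoop.mapreduce.lib.output.MultipleOutputs). The class and output names are illustrative; note that outputs written with a dynamic baseOutputPath also each get a counter when counters are enabled:

```java
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class DriverSnippet {
    public static void configure(Job job) {
        // Each named output registered here gets its own counter when
        // counters are enabled, and each counts toward the job-wide limit.
        MultipleOutputs.addNamedOutput(job, "errors",
                TextOutputFormat.class, Text.class, Text.class);

        // Turn the per-output counters off so MultipleOutputs no longer
        // contributes to the mapreduce.job.counters limit.
        MultipleOutputs.setCountersEnabled(job, false);
    }
}
```

Disabling the counters must happen on the Job object before submission; if the job is launched through Oozie, the equivalent property (mapreduce.multipleoutputs.counters, to the best of my knowledge) would need to go into the action's configuration instead.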
Source: https://stackoverflow.com/questions/20899050/org-apache-hadoop-mapreduce-counters-limitexceededexception-too-many-counters