org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many counters: 121 max=120


Question


I'm running a Hadoop job (from Oozie) that has a few counters and uses MultipleOutputs.

I get an error like: org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many counters: 121 max=120

Then I removed all the code that uses counters, set mout.setCountersEnabled to false, and also raised the counter limit to 240 in the Hadoop config.
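
For context, a minimal driver-side sketch of the two settings described above, assuming the standard org.apache.hadoop.mapreduce.lib.output.MultipleOutputs API; the job name, named output, and limit value are placeholders, and the counter-limit property name varies by Hadoop version:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class CounterLimitDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Property name depends on the Hadoop version: older releases use
        // mapreduce.job.counters.limit, newer ones mapreduce.job.counters.max.
        conf.setInt("mapreduce.job.counters.limit", 240);

        Job job = Job.getInstance(conf, "multi-output-job"); // placeholder job name
        // "extra" is a placeholder named output.
        MultipleOutputs.addNamedOutput(job, "extra", TextOutputFormat.class, Text.class, Text.class);
        // Stop MultipleOutputs from registering one counter per named output.
        MultipleOutputs.setCountersEnabled(job, false);
        // ... set mapper/reducer, input/output paths, then job.waitForCompletion(true);
    }
}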

Now I still get the same kind of error: org.apache.hadoop.mapreduce.counters.LimitExceededException: Too many counters: 241 max=240

How can I solve this problem? Is it possible that hidden counters exist? How can I find out which counters exist before the limit of 240 is exceeded? (The process seems to stop before I can print anything.)

Thanks, Xinsong


Answer 1:


I solved the problem with the following method: vi $HADOOP_HOME/conf/mapred-site.xml and raise the counter limit:

<property>
    <name>mapreduce.job.counters.limit</name>
    <!--<value>120</value>-->
    <value>20000</value>
    <description>Limit on the number of counters allowed per job. The default value is 120.</description>
</property>
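
As a possible alternative to changing mapred-site.xml cluster-wide, the limit can also be raised for a single job. Note the hedges here: the property name differs between Hadoop versions (mapreduce.job.counters.limit in older releases, mapreduce.job.counters.max in newer ones), and in some versions the limit is also enforced on the server side, so the cluster setting may still be required. A sketch, with a hypothetical job name:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class PerJobCounterLimit {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Raise the per-job counter limit (adjust the property name to your Hadoop version).
        conf.setInt("mapreduce.job.counters.limit", 20000);
        Job job = Job.getInstance(conf, "job-with-many-counters");
        // ... configure and submit the job as usual.
    }
}

If the driver is run through ToolRunner, the same property can also be passed on the command line with -D.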



Answer 2:


I found the reason. It is because of MultipleOutputs: each named output gets its own counter by default. After my change there were more named outputs, so the job exceeded the counter limit.
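
For illustration, a minimal mapper sketch (all names are hypothetical): when counters are enabled, each named output written through MultipleOutputs contributes its own counter, so every additional named output brings the job closer to the limit.

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

public class MultiOutMapper extends Mapper<LongWritable, Text, Text, Text> {
    private MultipleOutputs<Text, Text> mos;

    @Override
    protected void setup(Context context) {
        mos = new MultipleOutputs<>(context);
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Each distinct named output ("typeA", "typeB", ...) gets its own
        // counter when MultipleOutputs counters are enabled on the job.
        mos.write("typeA", new Text("k"), value);
        mos.write("typeB", new Text("k"), value);
    }

    @Override
    protected void cleanup(Context context)
            throws IOException, InterruptedException {
        mos.close();
    }
}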



Source: https://stackoverflow.com/questions/20899050/org-apache-hadoop-mapreduce-counters-limitexceededexception-too-many-counters
