Spark on YARN runs the job twice when it errors [duplicate]

你离开我真会死。 Submitted on 2019-12-10 06:57:56

Question


I run Spark on YARN, and when the application hits an error, Spark restarts it automatically.

I want the job to run exactly once, whether it succeeds or fails.

Is there any configuration property or API I can set for this?

I'm using Spark version 1.5.


Answer 1:


You have to set the spark.yarn.maxAppAttempts property to 1. Its default value comes from YARN's yarn.resourcemanager.am.max-attempts, which is 2 by default, so a failed application is retried once.

Set the property via code:

import org.apache.spark.SparkConf;

SparkConf conf = new SparkConf();
conf.set("spark.yarn.maxAppAttempts", "1"); // allow only a single YARN application attempt

Set when submitting the job via spark-submit:

--conf spark.yarn.maxAppAttempts=1
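For context, a full spark-submit invocation with this flag might look like the sketch below; the main class and jar name are placeholders for your own application:

# com.example.MyApp and my-app.jar are hypothetical placeholders
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.maxAppAttempts=1 \
  --class com.example.MyApp \
  my-app.jar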



Source: https://stackoverflow.com/questions/41606335/spark-on-yarn-run-double-times-when-error
