How to limit the number of retries on Spark job failure?

Asked by 盖世英雄少女心 on 2020-12-09 01:46

We are running a Spark job via spark-submit, and I can see that the job will be re-submitted in the case of failure.

How can I stop it from having attempt #2 in case of failure?

4 Answers
  •  刺人心 (answered 2020-12-09 02:16)

    Add the property yarn.resourcemanager.am.max-attempts to your yarn-site.xml (yarn-default.xml only holds the shipped defaults and should not be edited). It specifies the maximum number of ApplicationMaster attempts allowed for any application on the cluster; the default is 2.
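
    For illustration, here is a minimal sketch of that override, assuming a YARN cluster whose configuration you control; a value of 1 disables automatic re-attempts:

        <!-- yarn-site.xml: cluster-wide ceiling on ApplicationMaster attempts (default is 2) -->
        <property>
          <name>yarn.resourcemanager.am.max-attempts</name>
          <value>1</value>
        </property>

    If you only want to cap retries for a single job rather than for the whole cluster, Spark on YARN also honours spark.yarn.maxAppAttempts, which must not exceed the YARN maximum. A hypothetical submit command (the class name and jar are placeholders for illustration):

        # Cap this application at a single attempt; YARN will not re-run it on failure.
        spark-submit \
          --master yarn \
          --conf spark.yarn.maxAppAttempts=1 \
          --class com.example.MyJob \
          my-job.jar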

    For more details, look into this link.
