How to limit the number of retries on Spark job failure?

Asked by 盖世英雄少女心 on 2020-12-09 01:46

We are running a Spark job via spark-submit, and I can see that the job will be re-submitted in the case of failure.

How can I stop it from having a second attempt in case of failure?

4 Answers
  •  谎友^ (OP) · 2020-12-09 02:29

    An API- and programming-language-agnostic solution is to set the maximum number of YARN application attempts as a command-line argument:

    spark-submit --conf spark.yarn.maxAppAttempts=1 <other spark-submit options> <application-jar> [app arguments]

    See @code's answer.
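
    For reference, here is a minimal PySpark sketch of the programmatic alternative (setting the same property on a SparkConf before the session is created). This is an illustration, not part of the original answer; the application name is made up, and on YARN in cluster mode the property generally must be passed at submission time (as in the spark-submit line above), since the application has already been submitted by the time driver code runs.

    # Sketch: set spark.yarn.maxAppAttempts programmatically (assumes client mode,
    # where the conf is read before the YARN application is submitted).
    from pyspark import SparkConf
    from pyspark.sql import SparkSession

    conf = SparkConf().set("spark.yarn.maxAppAttempts", "1")  # allow only one attempt
    spark = (SparkSession.builder
             .config(conf=conf)
             .appName("single-attempt-job")  # hypothetical app name
             .getOrCreate())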
