How to limit the number of retries on Spark job failure?

盖世英雄少女心 2020-12-09 01:46

We are running a Spark job via spark-submit, and I can see that the job will be re-submitted in the case of failure.

How can I stop it from having attempt #2 in case of failure?

4 Answers
  •  小蘑菇
     2020-12-09 02:18

    There are two settings that control the number of retries (i.e. the maximum number of ApplicationMaster registration attempts with YARN before the whole Spark application is considered failed):

    • spark.yarn.maxAppAttempts - Spark's own setting. See MAX_APP_ATTEMPTS:

        private[spark] val MAX_APP_ATTEMPTS = ConfigBuilder("spark.yarn.maxAppAttempts")
          .doc("Maximum number of AM attempts before failing the app.")
          .intConf
          .createOptional
      
    • yarn.resourcemanager.am.max-attempts - YARN's own setting, with a default of 2.
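
    To actually cap the retries, set the Spark property before the application is submitted. Below is a minimal sketch, not code from the answer: it assumes yarn-client mode (where the SparkConf is still read at submission time) and uses 1 as an illustrative value. In yarn-cluster mode, pass the property on the spark-submit command line (--conf) or put it in spark-defaults.conf instead, because the application is submitted before your driver code runs.

        import org.apache.spark.SparkConf
        import org.apache.spark.sql.SparkSession

        // Illustrative only: allow a single application attempt on YARN.
        val conf = new SparkConf()
          .set("spark.yarn.maxAppAttempts", "1")

        val spark = SparkSession.builder()
          .config(conf)
          .getOrCreate()

        // Equivalent, and the safe form in yarn-cluster mode:
        //   spark-submit --conf spark.yarn.maxAppAttempts=1 ...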

    As you can see in YarnRMClient.getMaxRegAttempts, the actual number is the minimum of the YARN and Spark configuration settings, with YARN's being the last resort (used when Spark's is not set).
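
    For illustration only, that selection logic boils down to something like the sketch below. It is a hand-written paraphrase of what YarnRMClient.getMaxRegAttempts does, not the actual Spark source, and the helper name effectiveMaxAttempts is made up:

        // Spark's spark.yarn.maxAppAttempts is honoured only up to YARN's
        // yarn.resourcemanager.am.max-attempts (default 2); if Spark's setting
        // is absent, YARN's value is used.
        def effectiveMaxAttempts(sparkMaxAppAttempts: Option[Int],
                                 yarnAmMaxAttempts: Int = 2): Int =
          sparkMaxAppAttempts match {
            case Some(sparkValue) => math.min(sparkValue, yarnAmMaxAttempts)
            case None             => yarnAmMaxAttempts
          }

        // effectiveMaxAttempts(Some(1)) == 1  -- Spark caps it below YARN's limit
        // effectiveMaxAttempts(Some(5)) == 2  -- YARN's limit wins
        // effectiveMaxAttempts(None)    == 2  -- YARN's default is the last resort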
