How to limit the number of retries on Spark job failure?

Backend · Unresolved · 4 answers · 2185 views
盖世英雄少女心 · 2020-12-09 01:46

We are running a Spark job via spark-submit, and I can see that the job will be re-submitted in the case of failure.

How can I stop it from making a second attempt when the first one fails?
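One commonly suggested approach, assuming Spark on YARN, is to cap the number of YARN application attempts at submit time with spark.yarn.maxAppAttempts (which cannot exceed the cluster-wide yarn.resourcemanager.am.max-attempts). A minimal sketch, with placeholder class and jar names:

    # Cap YARN application attempts at 1 so a failed driver
    # is not automatically re-submitted as "attempt 2".
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.yarn.maxAppAttempts=1 \
      --class com.example.MyJob \
      my-job.jar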

4 Answers
  •  夕颜 (original poster) · 2020-12-09 02:25

    But in general, in which cases would a job fail once and then recover on the second attempt? When the cluster or queue is too busy, I guess. I am running these jobs with Oozie coordinators, so I was thinking of setting the maximum attempts to 1: if the job fails, it will simply run again at the next materialization.
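    A minimal sketch of how that could look in an Oozie Spark action (assuming the spark-action schema with a <spark-opts> element; the action, class, and jar names below are placeholders):

        <action name="spark-job">
            <spark xmlns="uri:oozie:spark-action:0.2">
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <master>yarn</master>
                <mode>cluster</mode>
                <name>MyJob</name>
                <class>com.example.MyJob</class>
                <jar>${nameNode}/apps/my-job.jar</jar>
                <!-- Cap YARN attempts at 1: a failed run is not retried by YARN;
                     the next coordinator materialization runs the job again. -->
                <spark-opts>--conf spark.yarn.maxAppAttempts=1</spark-opts>
            </spark>
            <ok to="end"/>
            <error to="fail"/>
        </action>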
