We are running a Spark job via `spark-submit`, and I can see that the job will be re-submitted in the case of failure.
How can I stop it from retrying after the first attempt fails?
An API- and programming-language-agnostic solution is to cap the number of YARN application attempts with a command-line argument:
spark-submit --conf spark.yarn.maxAppAttempts=1
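In context, a fuller invocation might look like the sketch below; the application class, jar name, and master setting are placeholders, not values from the original question:

```shell
# Submit to YARN with retries disabled: the application gets exactly one attempt.
# spark.yarn.maxAppAttempts overrides YARN's cluster-wide yarn.resourcemanager.am.max-attempts
# (it cannot exceed that cluster limit, only lower it).
spark-submit \
  --master yarn \
  --conf spark.yarn.maxAppAttempts=1 \
  --class com.example.Main \
  app.jar
```

Because the flag is passed at submission time, no change to the application code is needed, which is what makes this approach language-agnostic.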
See @code's answer.